WO2015099016A1 - Image processing device, subject identification method, and program - Google Patents
Image processing device, subject identification method, and program
- Publication number: WO2015099016A1 (application PCT/JP2014/084254)
- Authority: WIPO (PCT)
- Prior art keywords
- corresponding point
- feature
- point
- information group
- relative
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/757—Matching configurations of points or features
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/35—Determination of transform parameters for the alignment of images, i.e. image registration using statistical methods
Definitions
- This application is based on and claims the benefit of priority from Japanese Patent Application No. 2013-268852 (filed on December 26, 2013), the entire contents of which are incorporated herein by reference.
- the present invention relates to an image processing apparatus, a subject identification method, and a program.
- Patent Document 1 discloses an apparatus for identifying a subject using SIFT (Scale Invariant Feature Transform) feature amounts.
- Patent Document 2 discloses a technique for creating an index of a search target image using a feature amount and calculating an image similarity based on the created index.
- An object of the present invention is to provide an image processing apparatus, a subject identification method, and a program that contribute to suppressing a reduction in subject identification accuracy even when there is an error in the feature point association.
- An image processing apparatus having the following configuration is provided.
- The image processing apparatus includes a first local feature quantity generation unit that detects one or more first feature points from a first image and calculates, for each detected first feature point, a first local feature quantity information group from a region in a predetermined range including that feature point.
- It further includes a corresponding point calculation unit that calculates, as corresponding point information, the correspondence between the first feature points and second feature points included in a second local feature quantity information group calculated from a second image.
- It further includes a corresponding point relative scale calculation unit that calculates, as corresponding point relative scale information, the relative relationship between the scale of a first feature point and the scale of the corresponding second feature point. It also includes a corresponding point selection unit that clusters at least one of the first feature points and the second feature points based on the corresponding point relative scale information and selects at least one feature point based on the clustering result. Furthermore, it includes a determination unit that compares the first image with the second image for each cluster, based on the feature points selected by the corresponding point selection unit, and determines the identity of the subject.
- a subject identification method is provided.
- In the method, one or more first feature points are detected from the first image, and a first local feature quantity information group corresponding to each first feature point is calculated from a region in a predetermined range including the detected first feature points.
- The method clusters at least one of the first feature points and the second feature points based on the corresponding point relative scale information, and selects at least one feature point based on the clustering result.
- A program executed by a computer that controls an image processing apparatus is also provided.
- The program causes the computer to detect one or more first feature points from the first image and to calculate a first local feature quantity information group corresponding to each first feature point from a region in a predetermined range including the detected first feature points.
- The program also causes the computer to cluster at least one of the first feature points and the second feature points based on the corresponding point relative scale information and to select at least one feature point based on the clustering result.
- the program can be recorded on a computer-readable storage medium.
- The storage medium may be non-transitory, such as a semiconductor memory, a hard disk, a magnetic recording medium, or an optical recording medium.
- the present invention can also be embodied as a computer program product.
- According to the present invention, there are provided an image processing apparatus, a subject identification method, and a program that contribute to suppressing a reduction in subject identification accuracy even when there is an error in the feature point association.
- A flowchart illustrating the operation of the second embodiment.
- A block diagram showing the configuration of the third embodiment.
- A flowchart showing the operation of the third embodiment.
- A block diagram showing the configuration of Modification 1.
- A flowchart showing the operation of Modification 1.
- A block diagram showing the configuration of the fourth embodiment.
- A flowchart showing the operation of the fourth embodiment.
- A block diagram showing the configuration of Modification 2.
- A flowchart showing the operation of Modification 2.
- A block diagram showing the configuration of the fifth embodiment.
- A flowchart showing the operation of the fifth embodiment.
- A block diagram showing the configuration of Modification 3.
- A flowchart showing the operation of Modification 3.
- A block diagram showing the configuration of Modification 4.
- A flowchart showing the operation of Modification 4.
- A block diagram showing the configuration of Modification 5.
- A flowchart showing the operation of Modification 5.
- A block diagram showing the configuration of the sixth embodiment.
- A flowchart showing the operation of the sixth embodiment.
- A block diagram showing the configuration of Modification 6.
- A flowchart showing the operation of Modification 6.
- A block diagram showing the configuration of Modification 7.
- A flowchart showing the operation of Modification 7.
- A block diagram showing the configuration of Modification 8.
- A flowchart showing the operation of Modification 8.
- A block diagram showing the configuration of Modification 9.
- A flowchart showing the operation of Modification 9.
- A block diagram showing the configuration of Modification 10.
- A flowchart showing the operation of Modification 10.
- A block diagram showing the configuration of Modification 11.
- A flowchart showing the operation of Modification 11.
- A block diagram showing the configuration of Modification 12.
- A flowchart showing the operation of Modification 12.
- A block diagram showing the configuration of Modification 13.
- A flowchart showing the operation of Modification 13.
- A block diagram showing the configuration of Modification 14.
- A flowchart showing the operation of Modification 14.
- A block diagram showing the configuration of the seventh embodiment.
- A flowchart showing the operation of the seventh embodiment.
- A block diagram showing the configuration of Modification 15.
- A flowchart showing the operation of Modification 15.
- A block diagram showing the configuration of Modification 16.
- A flowchart showing the operation of Modification 16.
- A block diagram showing the configuration of Modification 17.
- A flowchart showing the operation of Modification 17.
- A block diagram showing the configuration of Modification 18.
- A flowchart showing the operation of Modification 18.
- A block diagram showing the configuration of Modification 19.
- A flowchart showing the operation of Modification 19.
- A block diagram showing the configuration of Modification 20.
- A flowchart showing the operation of Modification 20.
- A block diagram showing the configuration of Modification 21.
- A flowchart showing the operation of Modification 21.
- A block diagram showing the configuration of Modification 22.
- A flowchart showing the operation of Modification 22.
- A block diagram showing the configuration of Modification 23.
- A flowchart showing the operation of Modification 23.
- A diagram showing an example of the scale values of feature points.
- A diagram showing an example of the distribution of normalized scale values of the feature points of an image.
- A diagram showing an example of the directions of feature points.
- the image processing apparatus 10 shown in FIG. 1 includes a first local feature quantity generation unit 11, a corresponding point calculation unit 13, a corresponding point relative scale calculation unit 14, a corresponding point selection unit 15, and a determination unit 16.
- The first local feature quantity generation unit 11 detects one or more first feature points from the first image. It then calculates a first local feature quantity information group corresponding to each first feature point from a region in a predetermined range including the detected first feature point.
- Corresponding point calculation unit 13 calculates the correspondence between the first feature point and the second feature point included in the second local feature amount information group calculated from the second image as corresponding point information.
- the corresponding point information is information indicating which feature point of the second image corresponds to an arbitrary feature point of the first image.
- The corresponding point relative scale calculation unit 14 calculates, as corresponding point relative scale information, the relative relationship between the scale of the first feature point and the scale of the second feature point, based on the first local feature quantity information group, the second local feature quantity information group, and the corresponding point information.
- The corresponding point selection unit 15 clusters at least one of the first feature points and the second feature points based on the corresponding point relative scale information, and selects at least one feature point based on the clustering result. Specifically, the corresponding point selection unit 15 selects feature points that have been clustered into clusters satisfying a predetermined condition on the relative relationship of the feature point scales. As a result, even if the corresponding point calculation unit 13 calculates an incorrect correspondence, the corresponding point selection unit 15 can exclude the incorrectly corresponded feature points.
- the determination unit 16 compares the first image and the second image for each cluster based on the feature points selected by the corresponding point selection unit 15.
- the image processing apparatus 10 can eliminate the feature points having the wrong correspondence relationship, it can prevent the subject from being identified based on the wrong feature points. Therefore, the image processing apparatus 10 contributes to suppressing a decrease in subject identification accuracy even when there is an error in the feature point association.
- FIG. 2 is a block diagram illustrating a functional configuration of the image processing apparatus 20 according to the first embodiment.
- the image processing apparatus 20 includes a first local feature value generation unit 201, a second local feature value generation unit 202, a corresponding point calculation unit 203, and a corresponding point relative scale calculation unit 204.
- the corresponding point selection unit 205 and the determination unit 206 are configured.
- FIG. 2 is not intended to limit the image processing apparatus 20 to the configuration shown in FIG. 2. In the following description, it is assumed that the first image includes one or more identical or similar subjects and that the second image includes a single subject. However, this is not intended to limit the first image and the second image to this condition.
- the first local feature value generation unit 201 detects the number of feature points satisfying a predetermined condition from the first image, and generates a local feature value of a peripheral region (neighboring region) including each detected feature point.
- the predetermined condition may be that the number of feature points exceeds a predetermined threshold.
- The first local feature quantity generation unit 201 outputs a first scale information group composed of the scale values of the detected feature points to the corresponding point relative scale calculation unit 204. It also outputs a first local feature quantity information group composed of the generated local feature quantities to the corresponding point calculation unit 203.
- the scale value of the feature point is, for example, information on the size associated with each feature point, and the value is calculated from the image of the peripheral area including the feature point.
- the scale value of the feature point desirably has a property that the value changes in accordance with the enlargement or reduction of the surrounding area including the feature point.
- The change according to the enlargement or reduction of the surrounding area may be, for example, linear, non-linear, logarithmic, or exponential.
- The second local feature quantity generation unit 202 performs the same operation as the first local feature quantity generation unit 201, generating a second local feature quantity group composed of the local feature quantities of the feature points of the second image and a second scale information group composed of the scale values of those feature points. It then outputs the second local feature quantity group to the corresponding point calculation unit 203 and the second scale information group to the corresponding point relative scale calculation unit 204.
- Alternatively, the local feature quantities of the feature points of the second image may be generated in advance and stored in a database or the like; in that case, the stored local feature quantity group may be used instead of the output of the second local feature quantity generation unit 202.
- The corresponding point calculation unit 203 receives the first local feature quantity information group output from the first local feature quantity generation unit 201 and the second local feature quantity group output from the second local feature quantity generation unit 202, and generates corresponding point information in a number satisfying a predetermined condition.
- the predetermined condition may be that the number of corresponding point information exceeds a predetermined threshold.
- The corresponding point calculation unit 203 calculates the distance between local feature quantities for an arbitrary feature point of the first image and an arbitrary feature point of the second image. Then, based on the calculated distances, it calculates correspondence relationships between the first local feature quantity information group and the second local feature quantity information group so as to satisfy a predetermined condition.
- the Euclidean distance may be used as the distance between feature points.
- the feature point with the smallest distance value may be calculated as the corresponding feature point.
- The presence or absence of a correspondence relationship may be determined using the ratio between the smallest distance value and the second smallest distance value as an evaluation measure.
- the feature point detection method is preferably determined as appropriate, and is not intended to be limited to the method exemplified above.
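The nearest-neighbour matching with a distance-ratio check described above can be sketched as follows (a minimal illustration only: the function name, the 0.8 ratio, and plain lists as descriptors are assumptions, not specified by the patent):

```python
import math

def match_ratio_test(desc1, desc2, ratio=0.8):
    """Match each descriptor in desc1 to its nearest neighbour in desc2,
    keeping the match only when the nearest distance is sufficiently
    smaller than the second-nearest distance (the ratio evaluation
    measure described above)."""
    def dist(a, b):
        # Euclidean distance between two descriptor vectors
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    matches = []  # pairs of (index in desc1, index in desc2)
    for q, d1 in enumerate(desc1):
        dists = sorted((dist(d1, d2), p) for p, d2 in enumerate(desc2))
        if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
            matches.append((q, dists[0][1]))
    return matches
```

An ambiguous feature (two near-equal distances) simply yields no match, which is how the ratio test suppresses unreliable correspondences.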
- the corresponding point calculation unit 203 outputs a corresponding point information group configured by the calculated correspondence relationship to the corresponding point relative scale calculation unit 204.
- The corresponding point relative scale calculation unit 204 uses the corresponding point information group output from the corresponding point calculation unit 203, the first scale information group output from the first local feature quantity generation unit 201, and the second scale information group output from the second local feature quantity generation unit 202 to calculate the relative relationship between the scale values of mutually associated feature points (hereinafter, corresponding points), in a number satisfying a predetermined condition. It then outputs a corresponding point relative scale information group composed of the calculated relative relationships to the corresponding point selection unit 205.
- the corresponding point relative scale may be, for example, a ratio of scale values of corresponding points.
- For the calculation of the scale value ratio, for example, Equation 1 may be used. [Equation 1] σ = s(q) / s(p)
- Here, σ is the scale ratio of the corresponding point, s(q) is the scale value of the q-th feature point detected from the first image, and s(p) is the scale value of the p-th feature point detected from the second image.
- the scale ratio may be calculated using the logarithmic value of the scale value of the feature point.
- the corresponding point relative scale may be, for example, a difference value between the scale values of the corresponding points.
- In that case, for example, Equation 2 may be used. [Equation 2] σ′ = s′(q) − s′(p)
- Here, σ′ is the difference between the scale values of the corresponding points, s′(q) is the scale value of the q-th feature point detected from the first image, and s′(p) is the scale value of the p-th feature point detected from the second image.
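Equations 1 and 2 translate directly to code. The sketch below assumes scalar scale values and illustrative function names; note that taking the difference of logarithmic scale values (the variant mentioned above) is equivalent to the logarithm of the ratio:

```python
import math

def relative_scale_ratio(s_q, s_p):
    # Equation 1: ratio of the scale of the q-th feature point in the
    # first image to that of the corresponding p-th feature point in
    # the second image.
    return s_q / s_p

def relative_scale_log_diff(s_q, s_p):
    # Equation 2 with logarithmic scale values: the ratio becomes a
    # difference, since log(s_q) - log(s_p) = log(s_q / s_p).
    return math.log(s_q) - math.log(s_p)
```

If the subject in the first image is twice as large, every correctly matched pair yields the same ratio (2.0) or the same log difference (log 2), which is exactly the per-subject constancy the clustering step relies on.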
- FIG. 62 shows an example in which the subject of the image 10001 is shown in the image 10002 with a double size, and the size of the arrow indicates the size of the scale value.
- the scale value of the feature point has a property of relatively changing according to the subject size in the image. That is, as shown in FIG. 62, the scale value of the feature point has a property that, for example, when the subject size in the image is doubled, it is doubled.
- Likewise, when the subject size is 0.5 times, the scale value becomes 0.5 times. Therefore, the corresponding point relative scale calculated by Equation 1 or Equation 2 has the characteristic of being constant for each subject when the feature points of the first image and the feature points of the second image are correctly associated.
- the corresponding point selection unit 205 clusters the feature points included in the first image using the corresponding point relative scale information group output by the corresponding point relative scale calculation unit 204. Then, the corresponding point selection unit 205 selects feature points based on the clustering result. Specifically, for example, if the number of corresponding points included in an arbitrary cluster is equal to or greater than a predetermined threshold, the corresponding point selection unit 205 may select feature points included in the cluster.
- the corresponding point selection unit 205 outputs a selection information group including the corresponding point information group of the selected feature point and the cluster information to the determination unit 206.
- For the clustering, for example, a method may be used in which the distance between the corresponding point relative scale of each associated feature point in the first image and the centroid of each cluster is calculated, and the feature point is classified into the cluster with the smallest distance.
- If a cluster contains a feature point whose corresponding point relative scale similarity is below a predetermined threshold (for example, a feature point whose corresponding point relative scale is far from the cluster centroid), that feature point may be excluded from the cluster and classified into another cluster.
- As the distance between the cluster centroid and the corresponding point relative scale, for example, the Euclidean distance, the Mahalanobis distance, or the city block distance may be used.
- a method of calculating distances between all values and clustering the calculated distances by graph cut may be used.
- a graph is generated in which each feature point included in the first image is associated with each other as a node and the distance between the feature points is an edge between the nodes.
- For the graph cut, the known normalized cut may be used, or the Markov Cluster algorithm may be used. The details of the normalized cut and the Markov Cluster algorithm are omitted here.
- Alternatively, the known k-means method may be used, or the LBG (Linde-Buzo-Gray) method or the LBQ (Learning Bayesian Quantization) method may be used. The details of the k-means method, the LBG method, and the LBQ method are omitted here.
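As one concrete reading of the k-means option, a minimal one-dimensional k-means over relative scale values might look like the sketch below (the function name, the endpoint initialisation, and the fixed iteration count are illustrative assumptions; a production implementation would use a library routine):

```python
def kmeans_1d(values, k=2, iters=20):
    """Minimal 1-D k-means: cluster corresponding-point relative scale
    values so that points that share one subject (a roughly constant
    scale ratio) fall into the same cluster."""
    lo, hi = min(values), max(values)
    # spread the initial centroids over the value range
    centroids = [lo + (hi - lo) * i / max(k - 1, 1) for i in range(k)]
    assign = [0] * len(values)
    for _ in range(iters):
        # assignment step: each value goes to its nearest centroid
        assign = [min(range(k), key=lambda c: abs(v - centroids[c]))
                  for v in values]
        # update step: move each centroid to the mean of its members
        for c in range(k):
            members = [v for v, a in zip(values, assign) if a == c]
            if members:
                centroids[c] = sum(members) / len(members)
    return assign, centroids
```

Relative scales near 1.0 and near 2.0, say, end up in separate clusters, corresponding to two differently sized instances of a subject.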
- Alternatively, the feature points may be counted for each analysis region of an arbitrary size, and if the count value is equal to or greater than a predetermined threshold, the feature points included in that region may be classified into the same cluster.
- a method of dividing the first image into grids of an arbitrary size and using each grid as an analysis area may be used.
- the analysis areas may or may not overlap, for example.
- The size of the analysis region may be fixed or variable. If variable, for example, the analysis region size may be made smaller as the distance between the center of the analysis region and the image center decreases, and larger as that distance increases.
- A method may be used that classifies into the same cluster the feature points included in an analysis region whose count value is equal to or greater than a predetermined threshold, or a method that clusters into the same cluster the feature points included in that region and in the surrounding analysis regions.
- When analysis regions whose count values are equal to or greater than the predetermined threshold are adjacent or overlapping, the feature points included in those regions may be clustered into the same cluster, or clustering that classifies them into different clusters may be used.
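The grid-based analysis-region variant above can be sketched as follows (a simplified illustration: non-overlapping square cells, an illustrative function name, and an assumed threshold of 3; the patent also allows overlapping and variable-size regions):

```python
from collections import defaultdict

def cluster_by_grid(points, cell_size, min_count=3):
    """Divide the image plane into cell_size x cell_size analysis
    regions, count the feature points falling into each cell, and keep
    only the points of cells whose count reaches min_count.
    Returns a dict mapping cell coordinates -> list of point indices."""
    cells = defaultdict(list)
    for i, (x, y) in enumerate(points):
        cells[(int(x // cell_size), int(y // cell_size))].append(i)
    return {cell: idxs for cell, idxs in cells.items()
            if len(idxs) >= min_count}
```

Isolated (likely mismatched) feature points land in sparsely populated cells and are dropped, while dense cells survive as candidate subject regions.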
- the determination unit 206 uses the selection information group output from the corresponding point selection unit 205 to determine subjects that are identical or similar between images in units of clusters. For example, when the number of selected feature points exceeds a certain threshold, it may be determined that the target cluster and the second image are the same (or similar) subject. Then, the determination unit 206 outputs the determination result.
- The determination unit 206 may also determine identity or similarity by performing geometric verification using the obtained correspondence between the feature points of the two images. For example, assuming that the geometric relationship between the coordinate values of corresponding points is a projective transformation (homography), the projective transformation parameters may be estimated using a robust estimation method, each input correspondence may be judged an inlier or an outlier with respect to the estimated parameters, and identity or similarity may be determined based on the number of outliers.
- As the robust estimation method, for example, RANSAC (RANdom SAmple Consensus) or the least squares method may be used; the details of the robust estimation method are not limited.
- Alternatively, the determination unit 206 may estimate rotation or scale change from the coordinate values of a corresponding point pair consisting of two corresponding points belonging to the target cluster, separately estimate rotation and scale change based on the corresponding point relative scale and the difference value of the corresponding point direction information, compare the rotations and scale changes calculated by the two methods, and determine that the target cluster and the second image are the same if the similarity between them is sufficiently large.
- Specifically, a corresponding point pair consisting of two corresponding points is taken from the target cluster; the line segment whose end points are the two feature points of the pair belonging to image 1 is compared with the line segment whose end points are the two corresponding feature points belonging to image 2, and the rotation and scale change between the two line segments are calculated and compared with the estimates obtained from the relative scale and direction difference. In the comparison, either or both of rotation and scale change may be used. In addition, since rotation and scale change values are calculated from the corresponding point relative scale and the direction-information difference for each of the two corresponding points of the pair, the values may be compared independently, either value may be used, or their average may be used.
- The same determination may be made based on a corresponding point pair made up of any two corresponding points belonging to the target cluster, or based on N (N > 1) corresponding point pairs taken from the target cluster.
- N may be all corresponding point pairs belonging to the target cluster, or may be some corresponding point pairs.
- Alternatively, the determination unit 206 may estimate rotation or scale change from the coordinate values of two or more corresponding points belonging to the target cluster, estimate rotation or scale change based on the corresponding point relative scale and the difference value of the corresponding point direction information, and determine that the target cluster and the second image are the same if the two estimates are sufficiently similar.
- Specifically, N corresponding point pairs are taken from the target cluster. For each pair, the line segment whose end points are the two feature points belonging to image 1 is compared with the line segment whose end points are the two corresponding feature points belonging to image 2, and the rotation and scale change of the line segments are compared with the rotation and scale change calculated from the corresponding point relative scale and the direction-information difference. In the comparison, either or both of rotation and scale change may be used. When calculating the rotation and scale change of the line segments from the N corresponding point pairs, the average values may be used, the median values may be used, or the values calculated from any one corresponding point pair may be used.
- Similarly, the average of the values obtained from the N corresponding point relative scales and direction-information differences may be used, or the median, or the value calculated from any one corresponding point.
- N may be all corresponding point pairs belonging to the target cluster, or may be some corresponding point pairs.
- Alternatively, the determination unit 206 may compare the geometric transformation model of the corresponding point coordinate values estimated by a robust estimation method with the geometric transformation model estimated from the corresponding point relative scale and the difference value of the corresponding point direction information, and determine that the target cluster and the second image are the same if the similarity between the two geometric transformation models is sufficiently large.
- a similarity transformation model, an affine transformation model, or a projective transformation model may be used as the geometric transformation model of the coordinate value of the corresponding point.
- To estimate the geometric transformation model of the corresponding points based on the corresponding point relative scale and the difference value of the direction information, for example, the following equation (Equation 3) may be used.
- Here, σij represents the corresponding point relative scale computed from the scale value of the i-th feature point of the first image and the scale value of the j-th feature point of the second image, and ρij represents the difference value between the direction information of feature point i and feature point j.
- For the similarity between two geometric transformation models, for example, if the sum of the distance values between the parameters relating to scale and rotation among the parameters of the estimated transformation models is less than a threshold value, the models may be judged to be the same. Alternatively, if all the distance values between corresponding parameters are less than the threshold value, they may be determined to be the same.
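- The comparison of two estimated geometric transformation models can be sketched as follows. This is a minimal Python illustration, not the patent's actual implementation: the function names are hypothetical, each model is reduced to a (scale, rotation) pair, and the similarity transform is estimated from only two point pairs rather than by robust estimation.

```python
import math

def similarity_params_from_points(src, dst):
    # Estimate the scale and rotation (degrees) of a similarity transform
    # mapping src -> dst, using only two point pairs (a minimal sketch).
    (x1, y1), (x2, y2) = src
    (u1, v1), (u2, v2) = dst
    d_src = complex(x2 - x1, y2 - y1)
    d_dst = complex(u2 - u1, v2 - v1)
    ratio = d_dst / d_src  # complex ratio encodes scale and rotation
    return abs(ratio), math.degrees(math.atan2(ratio.imag, ratio.real))

def models_agree(model_a, model_b, threshold):
    # First criterion above: sum of the distances between the scale
    # parameters and the rotation parameters must be below the threshold.
    scale_a, rot_a = model_a
    scale_b, rot_b = model_b
    return abs(scale_a - scale_b) + abs(rot_a - rot_b) < threshold
```

Two models estimated from the coordinate values and from the relative scale/direction statistics would then be judged the same subject when `models_agree` holds.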
- FIG. 3 is a flowchart showing the operation of this embodiment.
- the first local feature generating unit 201 detects feature points in a number satisfying a predetermined condition from the first image, and the second local feature generating unit 202 detects feature points in a number satisfying the predetermined condition from the second image (step S301).
- the first local feature value generation unit 201 and the second local feature value generation unit 202 generate local feature values and scale information from the coordinate values of each feature point (step S302).
- the corresponding point calculation unit 203 obtains the correspondence relationship between the feature points of the two images based on the distance between an arbitrary local feature amount of the first local feature amount information group and an arbitrary local feature amount of the second local feature amount group (step S303).
- the corresponding point relative scale calculation unit 204 calculates a relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group (step S304).
- the corresponding point selection unit 205 clusters the feature points based on the corresponding point relative scale, and selects the feature points based on the clustering result (step S305).
- the determination unit 206 determines similarity or identity between images based on the number of selected feature points in units of clusters (step S306).
- the image processing apparatus 10 clusters feature points based on the corresponding point relative scale information and determines the same or similar subject between images in units of clusters. Since the relative scale between feature points correctly associated between images is constant for each subject, clustering the feature points based on the corresponding point relative scale information makes it possible to delete incorrect associations between the feature points of the first image and the feature points of the second image. Thereby, the same or similar subject can be accurately identified between images.
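- The clustering described above can be illustrated with a short sketch. The function below is hypothetical: it quantizes the logarithm of each corresponding point relative scale into bins and keeps only bins with enough members, one simple way to exploit the constant-relative-scale property; the bin width and minimum cluster size are arbitrary assumptions.

```python
import math
from collections import defaultdict

def cluster_by_relative_scale(pairs, bin_width=0.5, min_cluster_size=3):
    # pairs: list of (scale_in_image1, scale_in_image2) per corresponding point.
    # Correct matches on one subject share the same scale ratio, so they fall
    # into the same quantized log-scale bin; mismatches scatter across bins.
    clusters = defaultdict(list)
    for idx, (s1, s2) in enumerate(pairs):
        key = round(math.log2(s1 / s2) / bin_width)  # quantized log relative scale
        clusters[key].append(idx)
    return [ids for ids in clusters.values() if len(ids) >= min_cluster_size]
```

Pairs whose ratio deviates from the dominant ratio end up in small bins and are discarded, which deletes the incorrect associations.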
- FIG. 4 is a block diagram illustrating a functional configuration of the image processing apparatus 40 according to the second embodiment.
- the image processing apparatus 40 has the same configuration as the image processing apparatus 20 shown in FIG. 2, and the configuration and operation of the corresponding point selection unit 401 are different.
- the corresponding point selection unit 401 clusters the feature points included in the first image using the corresponding point relative scale information group output from the corresponding point relative scale calculation unit 204 and the relative scale range.
- the clustering of feature points and the selection of corresponding points may use the same method as in the first embodiment, and a detailed description thereof is omitted.
- the relative scale range may be, for example, a value based on experience or observation, a value learned automatically using learning data, or a value calculated from the corresponding point relative scale information group by unsupervised clustering.
- One relative scale range may be used, or two or more relative scale ranges may be used.
- the corresponding point selection unit 401 may, for example, classify feature points whose relative scale values are distributed within the same relative scale range into the same cluster.
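- Range-based clustering as described for the corresponding point selection unit 401 might look like the following sketch, where the relative scale ranges are given in advance (the function name and the half-open interval convention are assumptions):

```python
def cluster_by_scale_ranges(relative_scales, ranges):
    # ranges: list of (low, high) relative scale ranges given in advance.
    # Feature points whose relative scale falls within the same range are
    # classified into the same cluster; values outside every range are dropped.
    clusters = [[] for _ in ranges]
    for idx, sigma in enumerate(relative_scales):
        for c, (low, high) in enumerate(ranges):
            if low <= sigma < high:
                clusters[c].append(idx)
                break
    return clusters
```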
- FIG. 5 is a flowchart showing the operation of this embodiment. The operation of this embodiment will be described in detail with reference to FIG. Note that steps S501 to S503 and S506 shown in FIG. 5 are the same as steps S301 to S303 and S306 shown in FIG.
- After executing the processing of steps S501 to S503 shown in FIG. 5, the corresponding point relative scale calculation unit 204 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group (step S504).
- the corresponding point selection unit 401 clusters the feature points based on the corresponding point relative scale and the relative scale range, and selects the feature points based on the clustering result (step S505). Then, the process proceeds to step S506 shown in FIG. 5.
- FIG. 6 is a block diagram illustrating a functional configuration of the image processing apparatus 60 according to the third embodiment. As shown in FIG. 6, the image processing apparatus 60 has the same configuration as the image processing apparatus 20 shown in FIG. 2, and the configuration and operation of the corresponding point normalized relative scale calculation unit 601 are different.
- the corresponding point normalized relative scale calculation unit 601 uses the corresponding point information group output from the corresponding point calculation unit 203, the first scale information group output from the first local feature quantity generation unit 201, and the second scale information group output from the second local feature quantity generation unit 202 to calculate corresponding point relative scales, which are the relative relationships of the scale values of the corresponding points, for at least a number of corresponding points satisfying a predetermined condition.
- the corresponding point normalized relative scale calculation unit 601 normalizes the calculated corresponding point relative scales based on a normalization value, and creates a corresponding point normalized relative scale information group composed of the normalized corresponding point relative scales.
- the generated corresponding point normalized relative scale information group is output to the corresponding point selection unit 205.
- For calculating the corresponding point relative scale, a method similar to that of the corresponding point relative scale calculation unit 204 according to the first embodiment may be used.
- For normalization of the corresponding point relative scale, for example, the equation of Equation 4 may be used.
- σ normal is the normalized corresponding point relative scale value, σ(n) is the relative scale value of the n-th corresponding point (for example, the pair of the q-th feature point of the first image and the p-th feature point of the second image), and z indicates the normalization value.
- the normalized value may be, for example, the size of the second image, the actual size of the subject in the second image, or the size of the subject in the second image.
- the size of the image may be, for example, the width of the image, the height of the image, the number of pixels of the image, or the aspect ratio of the image.
- the actual size of the subject may be, for example, the actual width or the actual height.
- the size of the subject may be, for example, the width of the subject in the image, the height, the number of pixels, or the aspect ratio.
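- Assuming the normalization in Equation 4 divides each relative scale by the normalization value z (the exact formula is given in the patent's Equation 4; the division form here is an assumption), the operation can be sketched as:

```python
def normalize_relative_scales(relative_scales, z):
    # Divide each corresponding point relative scale by the normalization
    # value z, e.g. the width of the second image or the width of the subject
    # in the second image (assumed division form of Equation 4).
    return [sigma / z for sigma in relative_scales]
```

With this normalization, correctly matched subjects of different sizes in the first image map to the same normalized value, which is the property exploited for clustering.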
- An example of the corresponding point normalized relative scale value is shown in FIG. 63.
- the horizontal axis in FIG. 63 represents the logarithm value of the normalized relative scale value, and the vertical axis represents the frequency.
- the corresponding point selection unit 205 may perform clustering so that feature points having a relative scale value equal to or higher than a predetermined frequency are classified into the same cluster, for example. For clustering, for example, the same method as in the first embodiment may be used.
- FIG. 7 is a flowchart showing the operation of this embodiment. The operation of this embodiment will be described in detail with reference to FIG. Note that steps S701 to S703 and S706 shown in FIG. 7 are the same as steps S301 to S303 and S306 shown in FIG.
- After executing the processing of steps S701 to S703 shown in FIG. 7, the corresponding point normalized relative scale calculation unit 601 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group, and normalizes the calculated relative scale relationship based on the image size or the actual subject size (step S704).
- the corresponding point selection unit 205 clusters the feature points based on the corresponding point normalized relative scale information group, and selects the feature points based on the clustering result (step S705). Then, the process proceeds to step S706 shown in FIG. 7.
- the same effect as in the first embodiment can be obtained. In addition, when the relative scale between feature points correctly associated between images is normalized, the normalized corresponding point relative scale is constant for all subjects in the first image. Therefore, by clustering the feature points based on the normalized corresponding point relative scale values, erroneous associations between the feature points of the first image and the feature points of the second image can be deleted even when, for example, subjects having the same pattern but different sizes, such as logos, appear in the first image. Thereby, even if subjects having different sizes between images are included, they can be identified more accurately than in the first embodiment.
- FIG. 10 is a block diagram illustrating a functional configuration of the image processing apparatus 100 according to the fourth embodiment.
- the image processing apparatus 100 includes a first local feature value generation unit 1001, a second local feature value generation unit 1002, a corresponding point calculation unit 203, a corresponding point relative direction calculation unit 1003, a corresponding point selection unit 1004, and a determination unit 206.
- Since the operations of the corresponding point calculation unit 203 and the determination unit 206 are the same as those in the first embodiment, a description thereof is omitted.
- the first local feature quantity generation unit 1001 detects feature points in a number satisfying a predetermined condition from the first image. Then, the first local feature quantity generation unit 1001 generates a first local feature quantity group composed of the local feature quantities of the detected feature points, and outputs it to the corresponding point calculation unit 203.
- the first local feature quantity generation unit 1001 calculates direction information of each feature point detected from the first image. Then, the first local feature quantity generation unit 1001 outputs the first direction information group configured by the calculated direction information of the feature points to the corresponding point relative direction calculation unit 1003.
- the direction information of the feature points is, for example, information on the direction (angle) associated with each feature point, and the value is calculated from the image of the peripheral area including the feature point. It is desirable that the direction information of the feature point has a property that, for example, when the image of the peripheral area including the feature point is rotated, the angle is rotated accordingly.
- the first local feature value generation unit 1001 may calculate the direction of the luminance gradient of the peripheral area including the feature point as the direction information of the feature point.
- For example, the first local feature quantity generation unit 1001 may divide the peripheral area including an arbitrary feature point into 4 × 4 patch areas and calculate the change in luminance in each divided patch area as the basis for the direction information of the feature point.
- In this case, the direction in which the calculated luminance change is largest may be calculated as the feature point direction information.
- the first local feature quantity generation unit 1001 may calculate a direction in which the strength of the wavelet coefficient is large as the feature point direction information using a wavelet filter.
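- The luminance-gradient option described above can be sketched as follows. The function is a hypothetical illustration: it builds a 10-degree orientation histogram weighted by gradient magnitude over a small patch and returns the strongest bin, roughly in the spirit of the description; it is not the patent's implementation.

```python
import math

def dominant_gradient_direction(patch):
    # patch: 2-D list of luminance values. Accumulate gradient orientations
    # (10-degree bins) weighted by gradient magnitude, then return the
    # strongest direction in degrees.
    h, w = len(patch), len(patch[0])
    hist = {}
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = patch[y][x + 1] - patch[y][x - 1]   # horizontal difference
            gy = patch[y + 1][x] - patch[y - 1][x]   # vertical difference
            mag = math.hypot(gx, gy)
            if mag == 0.0:
                continue
            ang = round(math.degrees(math.atan2(gy, gx)) / 10) * 10
            hist[ang] = hist.get(ang, 0.0) + mag
    return max(hist, key=hist.get) if hist else 0.0
```

Rotating the patch rotates the returned angle accordingly, which is the property required of feature point direction information above.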
- the second local feature quantity generation unit 1002 generates, by the same operation as the first local feature quantity generation unit 1001, a second local feature quantity group composed of the local feature quantities of each feature point of the second image and a second direction information group composed of the direction information of each feature point of the second image. Then, the second local feature value generation unit 1002 outputs the second local feature value group to the corresponding point calculation unit 203. Further, the second local feature quantity generation unit 1002 outputs the second direction information group to the corresponding point relative direction calculation unit 1003.
- the corresponding point relative direction calculation unit 1003 includes a corresponding point information group output from the corresponding point calculation unit 203, a first direction information group output from the first local feature value generation unit 1001, and a second local feature value generation. Using the second direction information group output by the unit 1002, the relative relationship of the direction information of the corresponding points (hereinafter referred to as the corresponding point relative direction information) is calculated beyond the number satisfying a predetermined condition. Then, the corresponding point relative direction calculation unit 1003 outputs a corresponding point relative direction information group composed of the calculated corresponding point relative direction information to the corresponding point selection unit 1004.
- the corresponding point relative direction calculation unit 1003 may calculate the difference value of the corresponding point direction information as the corresponding point relative direction information.
- For example, the equation of Equation 5 may be used to calculate the difference value of the direction information of the corresponding points.
- ρ represents the difference value of the direction information of the corresponding point, that is, ρ = θ(q) − θ(p), where θ(q) indicates the direction information of the q-th feature point detected from the first image and θ(p) indicates the direction information of the p-th feature point detected from the second image.
- the corresponding point relative direction may be, for example, the ratio of the direction information of the corresponding points.
- In this case, for example, the equation of Equation 6 may be used.
- ρ′ indicates the ratio of the direction information of the corresponding points.
- FIG. 64 shows an example in which the subject of the image 10012 is rotated 45 degrees with respect to the subject of the image 10011.
- the direction of the arrow in FIG. 64 indicates the direction information of the feature point.
- the feature point direction information has a property of relatively changing according to the rotation of the subject in the image. That is, when the subject in the image rotates 45 degrees, the direction information of all the feature points of the subject rotates 45 degrees. For this reason, the relative direction of the corresponding points calculated by Equations 5 and 6 has a characteristic of being constant for each subject when the feature points of the first image and the feature points of the second image are correctly associated.
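- The relative direction of Equation 5, and its constancy under subject rotation, can be illustrated with a one-line sketch (the wrap to [0, 360) degrees is an assumption):

```python
def corresponding_point_relative_direction(theta_q, theta_p):
    # Equation 5 sketch: difference of the direction information of the q-th
    # feature point (first image) and the p-th feature point (second image),
    # wrapped to [0, 360) degrees (the wrapping convention is assumed).
    return (theta_q - theta_p) % 360.0
```

If the subject in the second image is rotated 45 degrees, every correctly matched pair yields the same value of 45, which is the per-subject constancy exploited for clustering.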
- the corresponding point selection unit 1004 clusters the feature points included in the first image using the corresponding point relative direction information group output by the corresponding point relative direction calculation unit 1003. Then, the corresponding point selection unit 1004 selects feature points based on the clustering result, and outputs a selection information group. Since the selection of the feature points and the output of the selection information group are the same as those in the above embodiment, detailed description thereof is omitted.
- the corresponding point selection unit 1004 may perform clustering so that feature points having a high degree of similarity in the relative direction of the corresponding points (small distance values) are classified into the same cluster.
- For clustering of the feature points, for example, a method similar to that used by the corresponding point selection unit 205 of the first embodiment to cluster feature points based on the corresponding point relative scale may be used.
- For selecting the corresponding points, for example, if the number of feature points included in an arbitrary cluster is equal to or greater than a predetermined threshold value, the feature points included in that cluster may be selected.
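- Cluster-size-based selection as just described can be sketched as (the function name is hypothetical):

```python
def select_feature_points(clusters, min_size):
    # Keep the feature points of every cluster whose member count is at
    # least the predetermined threshold; small clusters are discarded as
    # likely mismatches.
    selected = []
    for members in clusters:
        if len(members) >= min_size:
            selected.extend(members)
    return selected
```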
- FIG. 11 is a flowchart showing the operation of this embodiment. The operation of this embodiment will be described in detail with reference to FIG. Note that steps S1101 to S1103 and S1106 shown in FIG. 11 are the same as steps S301 to S303 and S306 shown in FIG.
- After executing the processing of steps S1101 to S1103 shown in FIG. 11, the corresponding point relative direction calculation unit 1003 calculates the corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S1104).
- the corresponding point selection unit 1004 clusters the feature points based on the corresponding point relative direction information group, and selects the feature points based on the clustering result (step S1105). Then, the process proceeds to step S1106 shown in FIG. 11.
- the fifth embodiment will be described in detail.
- In this embodiment, both the corresponding point relative scale information and the corresponding point relative direction information are used for clustering.
- FIG. 14 is a block diagram illustrating a functional configuration of the image processing apparatus 140 according to the fifth embodiment.
- the image processing apparatus 140 includes a first local feature value generation unit 1401, a second local feature value generation unit 1402, a corresponding point calculation unit 203, a corresponding point relative scale calculation unit 204, a corresponding point relative direction calculation unit 1003, a corresponding point selection unit 1403, and a determination unit 206.
- Since the configurations and operations of the corresponding point calculation unit 203, the corresponding point relative direction calculation unit 1003, and the determination unit 206 are the same as those in the above-described embodiments, a description thereof is omitted.
- the first local feature value generation unit 1401 generates a first local feature value group and a first scale information group. Then, the first local feature value generation unit 1401 outputs the generated first local feature value group to the corresponding point calculation unit 203, and outputs the generated first scale information group to the corresponding point relative scale calculation unit 204. Further, the first local feature value generation unit 1401 outputs the first direction information group to the corresponding point relative direction calculation unit 1003.
- the second local feature value generation unit 1402 generates a second local feature value group, a second scale information group, and a second direction information group. Then, the second local feature value generation unit 1402 outputs the second local feature value group to the corresponding point calculation unit 203. Furthermore, the second local feature value generation unit 1402 outputs the second scale information group to the corresponding point relative scale calculation unit 204. Further, the second local feature value generation unit 1402 outputs the second direction information group to the corresponding point relative direction calculation unit 1003.
- the corresponding point selection unit 1403 uses the corresponding point relative scale information group output from the corresponding point relative scale calculation unit 204 and the corresponding point relative direction information group output from the corresponding point relative direction calculation unit 1003 to generate the first image. Cluster the feature points included in.
- the corresponding point selection unit 1403 may perform clustering so that feature points for which both the similarity of the corresponding point relative scale information and the similarity of the corresponding point relative direction exceed a predetermined threshold are classified into the same cluster.
- For the clustering of feature points and the selection of corresponding points, the same method as in the first embodiment may be used.
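- A hypothetical sketch of this combined clustering: a corresponding point pair joins a cluster only if both its quantized log relative scale and its quantized relative direction agree. The bin widths and the minimum cluster size are assumptions, and direction wrap-around at 0/360 degrees is not handled in this sketch.

```python
import math
from collections import defaultdict

def cluster_by_scale_and_direction(rel_scales, rel_dirs,
                                   scale_bin=0.5, dir_bin=20.0,
                                   min_cluster_size=3):
    # Two corresponding point pairs fall into the same cluster only when
    # both their log relative scales and their relative directions agree
    # within the chosen bin widths.
    clusters = defaultdict(list)
    for idx, (sigma, rho) in enumerate(zip(rel_scales, rel_dirs)):
        key = (round(math.log2(sigma) / scale_bin),
               round((rho % 360.0) / dir_bin))
        clusters[key].append(idx)
    return [ids for ids in clusters.values() if len(ids) >= min_cluster_size]
```

Requiring agreement in both quantities rejects mismatches that happen to share only a scale ratio or only a rotation.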
- FIG. 15 is a flowchart showing the operation of this embodiment. The operation of this embodiment will be described in detail with reference to FIG. Note that steps S1501 to S1503 and S1506 shown in FIG. 15 are the same as steps S301 to S303 and S306 shown in FIG.
- After executing the processing of steps S1501 to S1503 shown in FIG. 15, the corresponding point relative scale calculation unit 204 calculates the relative scale relationship based on the corresponding point information group, the first scale information group, and the second scale information group.
- the corresponding point relative direction calculation unit 1003 calculates corresponding point relative direction information based on the corresponding point information group, the first direction information group, and the second direction information group (step S1504).
- the corresponding point selection unit 1403 clusters the feature points based on the corresponding point relative scale information group and the corresponding point relative direction information group, and selects the feature points based on the clustering result (step S1505). Then, the process proceeds to step S1506 shown in FIG. 15.
- (Effect of Embodiment 5)
- According to this embodiment, the same effect as in the first embodiment is obtained. In addition, since the relative direction between feature points correctly associated between images is constant for each subject, clustering the feature points based on the relative direction information makes it possible to delete erroneous associations between the feature points of the first image and the feature points of the second image. Therefore, Embodiment 5 can achieve higher recognition accuracy than Embodiment 1.
- FIG. 22 is a block diagram illustrating a configuration of the image processing apparatus 220 according to the sixth embodiment. As shown in FIG. 22, the image processing apparatus 220 has the same configuration as the image processing apparatus 20 shown in FIG. 2, and the configuration and operation of the first local feature value generation unit 2201 and the corresponding point selection unit 2202 are different.
- the first local feature value generation unit 2201 generates a first local feature value group by the same operation as the first local feature value generation unit 201 according to the first embodiment, and outputs it to the corresponding point calculation unit 203.
- the first local feature quantity generation unit 2201 outputs the first scale information group generated from the feature points to the corresponding point relative scale calculation unit 204.
- the first local feature value generation unit 2201 outputs a first coordinate value information group configured by the coordinate values of the feature points detected from the first image to the corresponding point selection unit 2202.
- the corresponding point selection unit 2202 uses the corresponding point relative scale information group output from the corresponding point relative scale calculation unit 204 and the first coordinate value information group output from the first local feature quantity generation unit 2201, and The feature points included in one image are clustered. Then, the corresponding point selection unit 2202 selects feature points based on the clustering result, and outputs a selection information group. Since the selection of the feature points and the output of the selection information group are the same as those in the above embodiment, detailed description thereof is omitted.
- Corresponding point selection unit 2202 outputs a selection information group including a corresponding point information group of the selected feature points and a cluster information group to determination unit 206.
- the cluster information group includes a plurality of pieces of cluster information relating to a plurality of clusters including one or more feature points.
- the corresponding point selection unit 2202 clusters the feature points based on the corresponding point relative scale information group, selects the feature points based on the clustering result, and then selects the selected feature points as the first coordinate value information group. Clustering may be performed based on
- the corresponding point selection unit 2202 may cluster the feature points using, for example, both the corresponding point relative scale information and the first coordinate value information group.
- the same method as in the above-described embodiment may be used, and detailed description thereof is omitted.
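- One way to combine scale-based clustering with the first coordinate value information group, as described for the corresponding point selection unit 2202, is to split each scale cluster by spatial proximity of the feature point coordinates. The following single-linkage sketch is a hypothetical illustration; the distance threshold is an assumption.

```python
import math

def split_cluster_by_coordinates(coords, member_ids, max_dist=50.0):
    # After clustering by relative scale, split a cluster into spatially
    # coherent groups: a point joins an existing group if it lies within
    # max_dist of any member (single-linkage grouping).
    groups = []
    for idx in member_ids:
        x, y = coords[idx]
        placed = False
        for g in groups:
            if any(math.hypot(x - coords[m][0], y - coords[m][1]) <= max_dist
                   for m in g):
                g.append(idx)
                placed = True
                break
        if not placed:
            groups.append([idx])
    return groups
```

This separates two subjects that happen to share the same relative scale but occupy different regions of the first image.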
- FIG. 23 is a flowchart showing the operation of this embodiment. The operation of this embodiment will be described in detail with reference to FIG. Note that step S2301 to step S2303 and step S2306 shown in FIG. 23 are the same as step S301 to step S303 and step S306 shown in FIG.
- After executing the processing of steps S2301 to S2303 shown in FIG. 23, the corresponding point relative scale calculation unit 204 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group (step S2304).
- the corresponding point selection unit 2202 clusters the feature points based on the corresponding point relative scale information group and the first coordinate value information group, and selects the feature points based on the clustering result (step S2305). Then, the process proceeds to step S2306 shown in FIG. 23.
- the seventh embodiment will be described in detail.
- the relative coordinate values of feature points are calculated and used for clustering.
- FIG. 42 is a block diagram illustrating a functional configuration of the image processing apparatus 420 according to the seventh embodiment.
- the image processing device 420 has the same configuration as the image processing device 140 shown in FIG. 14, and the configurations and operations of the second local feature value generation unit 4201, the relative coordinate value calculation unit 4202, and the corresponding point selection unit 4203 are different.
- the first local feature value generation unit 3401 generates a first local feature value group, a first direction information group, and a first coordinate value information group, and outputs them to the corresponding point calculation unit 203, the corresponding point relative direction calculation unit 1003, and the corresponding point selection unit 4203, respectively.
- the first local feature quantity generation unit 3401 outputs the first scale information group generated by the same operation as the first local feature quantity generation unit 201 shown in FIG. 2 to the corresponding point relative scale calculation unit 204. .
- the second local feature quantity generation unit 4201 calculates corresponding points for the second local feature quantity group, the second scale information group, the second direction information group, and the second coordinate value information group, respectively. Output to the unit 203, the corresponding point relative scale calculation unit 204, the corresponding point relative direction calculation unit 1003, and the relative coordinate value calculation unit 4202.
- the relative coordinate value calculation unit 4202 uses the corresponding point relative scale information group output from the corresponding point relative scale calculation unit 204, the corresponding point relative direction information group output from the corresponding point relative direction calculation unit 1003, and the first coordinate value information group output from the first local feature value generation unit 3401 to convert the coordinate values of the feature points into points in an arbitrary coordinate system.
- the relative coordinate value calculation unit 4202 outputs the converted coordinate value (hereinafter referred to as the corresponding point relative coordinate value) to the corresponding point selection unit 4203.
- the reference point is a predetermined coordinate value, and may be an arbitrary point in the same Cartesian coordinate system as that of the second image, for example. In the following description, the subject center point is used as a reference point.
- the relative coordinate value calculation unit 4202 calculates the center point of each subject in the first image based on a reference point coordinate value that is an arbitrary reference point of the second image (for example, a subject center point), the second coordinate value information group, the corresponding point relative scale information group, the corresponding point relative direction information group, and the first coordinate value information group. Then, the relative coordinate value calculation unit 4202 outputs a relative coordinate value information group composed of the calculated subject center points to the corresponding point selection unit 4203.
- In Equation 7, i and j are the feature point numbers of the first image and the second image, v i is the coordinate value of the i-th feature point of the first image, σ ij is the corresponding point relative scale, ρ ij is the corresponding point relative direction, c ij is the coordinate value of the subject center in the first image, and u j ′ is the vector from the j-th feature point of the second image to the subject center point of the second image. For example, the vector u j ′ may be calculated as shown in Equation 8.
- In Equation 8, x j is the x coordinate value of the j-th feature point, y j is the y coordinate value of the j-th feature point, and x c and y c indicate the x and y coordinate values of the selected reference point of the second image.
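- Assuming Equation 7 composes v i with u j ′ scaled by σ ij and rotated by ρ ij (this composition is an assumption; the exact formula is in the patent's Equation 7), a vote for the subject center in the first image can be sketched as:

```python
import math

def vote_subject_center(v_i, sigma_ij, rho_ij_deg, feat_j, ref_c):
    # u_j' is the vector from the j-th feature point of the second image to
    # its reference point (Equation 8). The subject center c_ij in the first
    # image is then obtained by scaling u_j' by the corresponding point
    # relative scale, rotating it by the corresponding point relative
    # direction, and adding it to v_i (assumed form of Equation 7).
    xj, yj = feat_j
    xc, yc = ref_c
    ux, uy = xc - xj, yc - yj              # Equation 8
    t = math.radians(rho_ij_deg)
    rx = math.cos(t) * ux - math.sin(t) * uy
    ry = math.sin(t) * ux + math.cos(t) * uy
    return (v_i[0] + sigma_ij * rx, v_i[1] + sigma_ij * ry)
```

Correct correspondences on one subject vote for nearly the same center point, so the resulting relative coordinate values cluster tightly.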
- the corresponding point selection unit 4203 clusters the feature points included in the first image using the relative coordinate value information group output from the relative coordinate value calculation unit 4202 and the corresponding point relative scale information group output from the corresponding point relative scale calculation unit 204. Then, the corresponding point selection unit 4203 selects feature points based on the clustering result, and outputs a selection information group. Since the selection of the feature points and the output of the selection information group are the same as those in the above embodiment, a detailed description thereof is omitted.
- the corresponding point selection unit 4203 may use, for example, the same method as in the sixth embodiment for clustering of feature points based on the corresponding point relative scale group and selecting corresponding points. Further, for clustering of feature points based on the relative coordinate value information group and selection of corresponding points, for example, a method similar to the method of clustering feature points based on the first coordinate value information group in the sixth embodiment is used. May be.
- the relative coordinate value calculation unit 4202 may not use the reference point coordinate value, for example.
- the relative coordinate value calculation unit 4202 and the corresponding point selection unit 4203 may operate as follows, for example.
- The relative coordinate value calculation unit 4202 calculates, based on the second coordinate value information group, the corresponding point relative scale information group, the corresponding point relative direction information group, and the first coordinate value information group, a relative movement amount between each feature point in the first image and each feature point in the second image. Then, the relative coordinate value calculation unit 4202 outputs a relative coordinate value information group including the calculated relative movement amounts to the corresponding point selection unit 4203. In this case, for example, the following equation (9) may be used to calculate the corresponding point relative coordinate value. [Equation 9] Here, v ij represents the relative movement amount, and v j represents the coordinate value of the j-th feature point of the second image.
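The body of equation (9) is not reproduced in this text, so the following is only a hedged sketch of one plausible form: the second-image feature coordinate is rotated by the corresponding point relative direction, scaled by the corresponding point relative scale, and subtracted from the first-image feature coordinate. The function `relative_movement` and this exact form are assumptions, not the patent's definition.

```python
import numpy as np

def relative_movement(u_i, v_j, sigma_ij, rho_ij):
    """Hypothetical form of equation (9): rotate the second-image
    coordinate v_j by the corresponding point relative direction rho_ij
    (radians), scale it by the corresponding point relative scale
    sigma_ij, and subtract it from the first-image coordinate u_i."""
    c, s = np.cos(rho_ij), np.sin(rho_ij)
    rot = np.array([[c, -s], [s, c]])  # 2-D rotation matrix
    return np.asarray(u_i, dtype=float) - sigma_ij * rot @ np.asarray(v_j, dtype=float)
```

With unit scale and zero rotation this reduces to a plain coordinate difference.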
- The corresponding point selection unit 4203 performs, for example, clustering of the feature points and selection of the corresponding points based on the relative coordinate value information group. For this clustering, a method similar to the method of clustering the feature points based on the first coordinate value information group in the sixth embodiment may be used.
- the relative coordinate value calculation unit 4202 may calculate the relative coordinate value information group as described above when no reference point coordinate value is input.
- FIG. 43 is a flowchart showing the operation of this embodiment. The operation of this embodiment will be described in detail with reference to FIG. Note that steps S4301 to S4303 and step S4306 shown in FIG. 43 are the same as steps S301 to S303 and S306 described above, and thus detailed description thereof is omitted.
- the corresponding point relative scale calculation unit 204 calculates a relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group.
- the corresponding point relative direction calculation unit 1003 calculates the corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S4304).
- The relative coordinate value calculation unit 4202 calculates the relative coordinate values using the corresponding point relative scale information group, the corresponding point relative direction information group, the reference point coordinate value, the first coordinate value information group, and the second coordinate value information group (step S4305).
- The corresponding point selection unit 4203 clusters the feature points based on the relative coordinate value information group and the corresponding point relative scale information group, and selects the feature points based on the clustering result (step S4306). Then, the processing proceeds to step S4307 shown in FIG. 43.
- In the seventh embodiment, the same effects as in the first embodiment can be obtained. In addition, since the feature points of the first image are gathered at the subject center and then clustered, the feature points can be clustered more accurately than in the first embodiment. Therefore, the seventh embodiment can identify the same or a similar subject in the image with higher accuracy than the first embodiment.
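The seventh embodiment's idea of gathering first-image feature points at an estimated subject center before clustering can be sketched as follows. Grid-cell grouping stands in for the clustering method, which the text leaves unspecified; the function name and the cell size are assumptions.

```python
from collections import defaultdict

def cluster_center_estimates(centers, cell=25.0):
    """Each first-image feature point votes for a subject center c_ij;
    estimates that fall into the same grid cell of side `cell` (pixels)
    are grouped into one cluster."""
    clusters = defaultdict(list)
    for idx, (x, y) in enumerate(centers):
        key = (int(x // cell), int(y // cell))  # grid cell of the estimate
        clusters[key].append(idx)
    return list(clusters.values())
```

Estimates that land within the same 25-pixel cell end up in one cluster; well-separated estimates form separate clusters.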
- FIG. 8 is a block diagram illustrating a functional configuration of the image processing apparatus 80 according to the first modification.
- The image processing device 80 has the same configuration as the image processing device 60 shown in FIG. 6, except for the configuration and operation of the corresponding point selection unit 401. The configuration and operation of the corresponding point selection unit 401 are the same as those in the second embodiment, and a description thereof is omitted here.
- FIG. 9 is a flowchart showing the operation of the first modification. The operation of the first modification will be described in detail with reference to FIG. 9. Note that steps S901 to S903 and S906 shown in FIG. 9 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- The corresponding point normalized relative scale calculation unit 601 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group, and normalizes it based on the image size or the actual size of the subject (step S904).
- The corresponding point selection unit 401 clusters the feature points based on the corresponding point normalized relative scale information group and the relative scale range, and selects the feature points based on the clustering result (step S905). Then, the processing proceeds to step S906 shown in FIG. 9.
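A minimal sketch of the normalization in this flow, assuming the relative scale is the ratio of the two feature scales and that normalization divides out the ratio of the image sizes so that values from differently sized images become comparable. The function name and both of these assumptions are illustrative, not the patent's definition.

```python
def normalized_relative_scale(scale1, scale2, size1, size2):
    """Corresponding point relative scale (first-image feature scale over
    second-image feature scale), normalized by the ratio of the image
    sizes (e.g. image widths)."""
    relative = scale1 / scale2         # corresponding point relative scale
    return relative * (size2 / size1)  # normalization by image size
```

A feature pair whose scale ratio exactly matches the image-size ratio normalizes to 1.0.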
- FIG. 12 is a block diagram illustrating a functional configuration of the image processing apparatus 120 according to the second modification.
- The image processing apparatus 120 has the same configuration as the image processing apparatus 100 shown in FIG. 10, except for the configuration and operation of the corresponding point selection unit 1201.
- the corresponding point selection unit 1201 clusters the feature points included in the first image using the corresponding point relative direction information group output by the corresponding point relative direction calculation unit 1003 and the relative direction range.
- the relative direction range may be, for example, a value based on experience or observation, a value learned mechanically using learning data, or a value calculated from the corresponding point relative direction information group by unsupervised clustering. Further, the relative direction range may be one or plural.
- For example, the corresponding point selection unit 1201 can classify feature points whose relative direction information falls within the relative direction range into the same cluster. For the clustering of the feature points and the selection of the corresponding points, for example, a method similar to that of the first modification may be used.
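The relative-direction-range clustering described above can be sketched as follows, assuming directions are given in radians and each relative direction range is a (lo, hi) pair; both representations, and the function name, are assumptions.

```python
import math

def cluster_by_relative_direction(directions, dir_ranges):
    """Feature points whose corresponding point relative direction falls
    inside the same relative direction range go into the same cluster;
    points outside every range are discarded."""
    clusters = [[] for _ in dir_ranges]
    for idx, theta in enumerate(directions):
        theta = theta % (2 * math.pi)  # normalize to [0, 2*pi)
        for k, (lo, hi) in enumerate(dir_ranges):
            if lo <= theta < hi:
                clusters[k].append(idx)
                break
    return clusters
```

One or several ranges may be supplied, matching the text's note that the relative direction range may be one or plural.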
- FIG. 13 is a flowchart showing the operation of the second modification. The operation of the second modification will be described in detail with reference to FIG. 13. Note that steps S1301 to S1303 and S1306 shown in FIG. 13 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- The corresponding point relative direction calculation unit 1003 calculates the corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S1304).
- The corresponding point selection unit 1201 clusters the feature points based on the corresponding point relative direction information group and the relative direction range, and selects the feature points based on the clustering result (step S1305). Then, the processing proceeds to step S1306 shown in FIG. 13.
- FIG. 16 is a block diagram illustrating a functional configuration of the image processing apparatus 160 according to the third modification.
- The image processing device 160 has the same configuration as the image processing device 140 shown in FIG. 14, except for the configuration and operation of the corresponding point selection unit 1601.
- The corresponding point selection unit 1601 clusters the feature points included in the first image using the corresponding point relative scale information group output from the corresponding point relative scale calculation unit 204, the corresponding point relative direction information group output from the corresponding point relative direction calculation unit 1003, the corresponding point relative scale range, and the corresponding point relative direction range.
- For example, the corresponding point selection unit 1601 may classify into the same cluster the feature points whose corresponding point relative scale information falls within the relative scale range and whose corresponding point relative direction information falls within the relative direction range. For the clustering of the feature points and the selection of the corresponding points, the same method as in the fifth embodiment may be used.
- FIG. 17 is a flowchart showing the operation of the third modification. The operation of the third modification will be described in detail with reference to FIG. 17. Note that steps S1701 to S1703 and S1706 shown in FIG. 17 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- The corresponding point relative scale calculation unit 204 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group.
- the corresponding point relative direction calculation unit 1003 calculates corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S1704).
- The corresponding point selection unit 1601 clusters the feature points based on the corresponding point relative scale information group, the corresponding point relative direction information group, the relative scale range, and the relative direction range, and selects the feature points based on the clustering result (step S1705). Then, the processing proceeds to step S1706 shown in FIG. 17.
- FIG. 18 is a block diagram illustrating a functional configuration of an image processing apparatus 180 according to the fourth modification.
- The image processing apparatus 180 has the same configuration as the image processing apparatus 140 shown in FIG. 14, except for the configuration and operation of the corresponding point normalized relative scale calculation unit 601.
- the operation of the corresponding point normalized relative scale calculation unit 601 is the same as that of the first modification, and thus the description thereof is omitted.
- FIG. 19 is a flowchart showing the operation of the fourth modification. The operation of the fourth modification will be described in detail with reference to FIG. 19. Note that steps S1901 to S1903 and S1906 shown in FIG. 19 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- The corresponding point normalized relative scale calculation unit 601 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group, and normalizes the calculated relative scale relationship based on the image size or the actual subject size. Also, the corresponding point relative direction calculation unit 1003 calculates the corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S1904). The corresponding point selection unit 1403 clusters the feature points based on the corresponding point normalized relative scale information and the corresponding point relative direction information, and selects the feature points based on the clustering result (step S1905). Then, the processing proceeds to step S1906 shown in FIG. 19.
- FIG. 20 is a block diagram illustrating a functional configuration of an image processing apparatus 200 according to Modification 5.
- The image processing apparatus 200 has the same configuration as the image processing apparatus 180 illustrated in FIG. 18, except for the configuration and operation of the corresponding point selection unit 1601. The operation of the corresponding point selection unit 1601 is the same as that of the third modification, and thus a description thereof is omitted.
- FIG. 21 is a flowchart showing the operation of the fifth modification. The operation of the fifth modification will be described in detail with reference to FIG. 21. Note that steps S2101 to S2103 and S2106 shown in FIG. 21 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- After executing the processing of steps S2101 to S2103 shown in FIG. 21, the corresponding point normalized relative scale calculation unit 601 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group, and normalizes the calculated relative scale relationship based on the image size or the actual subject size.
- the corresponding point relative direction calculation unit 1003 calculates corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S2104).
- The corresponding point selection unit 1601 clusters the feature points based on the corresponding point normalized relative scale information group, the corresponding point relative direction information group, the relative scale range, and the relative direction range, and selects the feature points based on the clustering result (step S2105). Then, the processing proceeds to step S2106 shown in FIG. 21.
- FIG. 24 is a block diagram illustrating a functional configuration of an image processing apparatus 240 according to the sixth modification.
- The image processing apparatus 240 has the same configuration as the image processing apparatus 220 shown in FIG. 22, except for the configuration and operation of the corresponding point selection unit 2401.
- The corresponding point selection unit 2401 clusters the feature points included in the first image using the corresponding point relative scale information group output from the corresponding point relative scale calculation unit 204, the relative scale range, the feature point coordinate values output from the first local feature quantity generation unit 2201, and the coordinate value range.
- The relative coordinate value range may be, for example, a coordinate value range indicating where the subject exists in the image.
- The range in which the subject exists may be, for example, a rectangle circumscribing point p and point q determined from experience or observation, or it may be a region in which the subject existence probability, learned mechanically using learning data, is at or above a predetermined probability density.
- the relative coordinate value range may be one or plural.
- For example, the corresponding point selection unit 2401 may classify feature points whose relative scale values fall within the relative scale range into the same cluster. The corresponding point selection unit 2401 may also classify the feature points within the relative coordinate value range into the same cluster. For the clustering based on the relative scale and the clustering based on the coordinate values, for example, a method similar to that in the first embodiment or the sixth embodiment may be used. For the selection of the corresponding points, for example, a method similar to that of the fifth modification may be used.
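The combined selection described above can be sketched as follows, assuming the relative scale range is an interval and the relative coordinate value range is a single axis-aligned rectangle (xmin, ymin, xmax, ymax); both shapes, and all names, are assumptions made for illustration.

```python
def select_by_scale_and_region(points, scales, scale_range, region):
    """Keep only feature points whose relative scale lies in the relative
    scale range AND whose first-image coordinate lies inside the relative
    coordinate value range (an axis-aligned rectangle)."""
    lo, hi = scale_range
    xmin, ymin, xmax, ymax = region
    kept = []
    for idx, ((x, y), s) in enumerate(zip(points, scales)):
        if lo <= s <= hi and xmin <= x <= xmax and ymin <= y <= ymax:
            kept.append(idx)
    return kept
```

A point fails the selection if either its relative scale or its coordinate falls outside the respective range.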
- For example, the corresponding point selection unit 2401 may cluster the feature points based on the corresponding point relative direction information group, select the feature points based on the clustering result, and then cluster the selected feature points based on the first coordinate value information group.
- Alternatively, the corresponding point selection unit 2401 may cluster the feature points based on the first coordinate value information group, select the feature points based on the clustering result, and then cluster the selected feature points based on the corresponding point relative direction information group.
- the corresponding point selection unit 2401 may cluster feature points using both the corresponding point relative direction information group and the first coordinate value information group.
- FIG. 25 is a flowchart showing the operation of the sixth modification. The operation of the sixth modification will be described in detail with reference to FIG. 25. Note that steps S2501 to S2503 and S2506 shown in FIG. 25 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- The corresponding point relative scale calculation unit 204 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group (step S2504).
- The corresponding point selection unit 2401 clusters the feature points based on the corresponding point relative scale information group, the first coordinate value information group, the relative scale range, and the relative coordinate value range, and selects the feature points based on the clustering result (step S2505). Then, the processing proceeds to step S2506 shown in FIG. 25.
- FIG. 26 is a block diagram illustrating a functional configuration of an image processing apparatus 260 according to the seventh modification.
- The image processing device 260 has the same configuration as the image processing device 220 shown in FIG. 22, except for the configuration and operation of the corresponding point normalized relative scale calculation unit 601.
- the operation of the corresponding point normalized relative scale calculation unit 601 is the same as that of the first modification, and thus the description thereof is omitted.
- FIG. 27 is a flowchart showing the operation of the seventh modification. The operation of the seventh modification will be described in detail with reference to FIG. 27. Note that steps S2701 to S2703 and S2706 shown in FIG. 27 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- The corresponding point normalized relative scale calculation unit 601 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group, and normalizes the calculated relative scale relationship based on the image size or the actual subject size (step S2704).
- The corresponding point selection unit 2202 clusters the feature points based on the corresponding point normalized relative scale information and the first coordinate value information group, and selects the feature points based on the clustering result (step S2705). Then, the processing proceeds to step S2706 shown in FIG. 27.
- FIG. 28 is a block diagram illustrating a functional configuration of an image processing apparatus 280 according to Modification 8.
- The image processing apparatus 280 has the same configuration as the image processing apparatus 260 shown in FIG. 26, except for the configuration and operation of the corresponding point selection unit 2401.
- the operation of the corresponding point selection unit 2401 is the same as that of the sixth modification, and thus the description thereof is omitted.
- FIG. 29 is a flowchart showing the operation of the eighth modification. The operation of the eighth modification will be described in detail with reference to FIG. 29. Note that steps S2901 to S2903 and S2906 shown in FIG. 29 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- The corresponding point normalized relative scale calculation unit 601 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group, and normalizes the calculated relative scale relationship based on the image size or the actual subject size (step S2904).
- The corresponding point selection unit 2401 clusters the feature points based on the corresponding point relative scale, the first coordinate value information group, the relative scale range, and the relative coordinate value range, and selects the feature points based on the clustering result (step S2905). Then, the processing proceeds to step S2906 shown in FIG. 29.
- FIG. 30 is a block diagram illustrating a functional configuration of an image processing apparatus 300 according to the ninth modification.
- The image processing apparatus 300 has the same configuration as the image processing apparatus 100 shown in FIG. 10, except for the configuration and operation of the first local feature quantity generation unit 3001 and the corresponding point selection unit 3002.
- The first local feature quantity generation unit 3001 outputs the first direction information group, generated by the same operation as the first local feature quantity generation unit 201 according to the first embodiment, to the corresponding point relative direction calculation unit 1003. In addition, the first local feature quantity generation unit 3001 generates the first local feature quantity group and the first coordinate value information group by the same operation as the first local feature quantity generation unit 2201 according to the sixth embodiment, and outputs them to the corresponding point calculation unit 203 and the corresponding point selection unit 3002, respectively.
- The corresponding point selection unit 3002 clusters the feature points included in the first image using the corresponding point relative direction information group output from the corresponding point relative direction calculation unit 1003 and the first coordinate value information group output from the first local feature quantity generation unit 3001.
- FIG. 31 is a flowchart showing the operation of the ninth modification. The operation of the ninth modification will be described in detail with reference to FIG. 31. Note that steps S3101 to S3103 and step S3106 shown in FIG. 31 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- After executing the processing of steps S3101 to S3103 shown in FIG. 31, the corresponding point relative direction calculation unit 1003 calculates the corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S3104).
- The corresponding point selection unit 3002 clusters the feature points based on the corresponding point relative direction information group and the first coordinate value information group, and selects the feature points based on the clustering result (step S3105). Then, the processing proceeds to step S3106 shown in FIG. 31.
- FIG. 32 is a block diagram illustrating a functional configuration of an image processing device 320 according to the tenth modification.
- The image processing device 320 has the same configuration as the image processing device 300 shown in FIG. 30, except for the configuration and operation of the corresponding point selection unit 3201. Therefore, the corresponding point selection unit 3201 will be described below.
- The corresponding point selection unit 3201 clusters the feature points included in the first image using the corresponding point relative direction information group output from the corresponding point relative direction calculation unit 1003, the relative direction range, the feature point coordinate values output from the first local feature quantity generation unit 3001, and the coordinate value range.
- For example, the corresponding point selection unit 3201 may cluster the feature points based on the corresponding point relative direction information group, select the feature points based on the clustering result, and then cluster the selected feature points based on the first coordinate value information group.
- Alternatively, the corresponding point selection unit 3201 may cluster the feature points based on the first coordinate value information group, select the feature points based on the clustering result, and then cluster the selected feature points based on the corresponding point relative direction information group.
- the corresponding point selection unit 3201 may cluster feature points using both the corresponding point relative direction information group and the first coordinate value information group, for example.
- FIG. 33 is a flowchart showing the operation of the tenth modification. The operation of the tenth modification will be described in detail with reference to FIG. 33. Note that steps S3301 to S3303 and step S3306 shown in FIG. 33 are the same as steps S301 to S303 and step S306 shown in FIG. 3.
- After executing the processing of steps S3301 to S3303 shown in FIG. 33, the corresponding point relative direction calculation unit 1003 calculates the corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S3304).
- The corresponding point selection unit 3201 clusters the feature points based on the corresponding point relative direction information group, the first coordinate value information group, the relative direction range, and the relative coordinate value range, and selects the feature points based on the clustering result (step S3305). Then, the processing proceeds to step S3306 shown in FIG. 33.
- FIG. 34 is a block diagram illustrating a functional configuration of an image processing device 340 according to the eleventh modification.
- The image processing device 340 has the same configuration as the image processing device 140 shown in FIG. 14, except for the configuration and operation of the first local feature value generation unit 3401 and the corresponding point selection unit 3402.
- The first local feature value generation unit 3401 operates in the same manner as the first local feature value generation unit 3001 illustrated in FIG. 30: it generates the first local feature value group, the first direction information group, and the first coordinate value information group, and outputs them to the corresponding point calculation unit 203, the corresponding point relative direction calculation unit 1003, and the corresponding point selection unit 3402, respectively.
- In addition, the first local feature value generation unit 3401 outputs the first scale information group, generated by the same operation as the first local feature value generation unit 201 according to the first embodiment, to the corresponding point relative scale calculation unit 204.
- The corresponding point selection unit 3402 clusters the feature points included in the first image using the corresponding point relative scale information group output from the corresponding point relative scale calculation unit 204, the corresponding point relative direction information group output from the corresponding point relative direction calculation unit 1003, and the first coordinate value information group output from the first local feature value generation unit 3401.
- FIG. 35 is a flowchart showing the operation of the eleventh modification. The operation of the eleventh modification will be described in detail with reference to FIG. 35. Note that steps S3501 to S3503 and S3506 shown in FIG. 35 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- the corresponding point relative scale calculation unit 204 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group.
- the corresponding point relative direction calculation unit 1003 calculates corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S3504).
- The corresponding point selection unit 3402 clusters the feature points based on the corresponding point relative scale information group, the corresponding point relative direction information group, and the first coordinate value information group, and selects the feature points based on the clustering result (step S3505). Then, the processing proceeds to step S3506 shown in FIG. 35.
- FIG. 36 is a block diagram illustrating a functional configuration of an image processing apparatus 360 according to the twelfth modification.
- The image processing device 360 has the same configuration as the image processing device 340 shown in FIG. 34, except for the configuration and operation of the corresponding point selection unit 3601.
- The corresponding point selection unit 3601 clusters the feature points included in the first image using the corresponding point relative scale information group output by the corresponding point relative scale calculation unit 204, the corresponding point relative direction information group output by the corresponding point relative direction calculation unit 1003, the first coordinate value information group output by the first local feature value generation unit 3401, the relative scale range, the relative direction range, and the relative coordinate value range.
- FIG. 37 is a flowchart showing the operation of the twelfth modification. The operation of the twelfth modification will be described in detail with reference to FIG. 37. Note that steps S3701 to S3703 and S3706 shown in FIG. 37 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- The corresponding point relative scale calculation unit 204 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group.
- the corresponding point relative direction calculation unit 1003 calculates corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S3704).
- The corresponding point selection unit 3601 clusters the feature points based on the corresponding point relative scale information group, the corresponding point relative direction information group, the first coordinate value information group, the relative scale range, the relative direction range, and the relative coordinate value range, and selects the feature points based on the clustering result (step S3705). Then, the processing proceeds to step S3706 shown in FIG. 37.
- The twelfth modification has the same effects as the eleventh modification. In addition, since only the feature points having relative scale information within the relative scale range, relative direction information within the relative direction range, and coordinate value information within the relative coordinate value range are clustered, the amount of calculation can be reduced compared with the eleventh modification.
- FIG. 38 is a block diagram illustrating a functional configuration of an image processing apparatus 380 according to Modification 13.
- The image processing device 380 has the same configuration as the image processing device 340 shown in FIG. 34, except for the configuration and operation of the corresponding point normalized relative scale calculation unit 601.
- The operation of the corresponding point normalized relative scale calculation unit 601 is the same as that of the first modification, and a description thereof will be omitted.
- FIG. 39 is a flowchart showing the operation of the thirteenth modification. The operation of the thirteenth modification will be described in detail with reference to FIG. 39. Note that steps S3901 to S3903 and S3906 shown in FIG. 39 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- The corresponding point normalized relative scale calculation unit 601 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group. Also, the corresponding point relative direction calculation unit 1003 calculates the corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S3904). The corresponding point selection unit 3402 clusters the feature points based on the corresponding point relative scale information group, the corresponding point relative direction information group, and the first coordinate value information group, and selects the feature points based on the clustering result (step S3905). Then, the processing proceeds to step S3906 shown in FIG. 39.
- FIG. 40 is a block diagram illustrating a functional configuration of an image processing apparatus 400 according to Modification 14.
- The image processing apparatus 400 has the same configuration as the image processing device 380 shown in FIG. 38, except that the configuration and operation of the corresponding point selection unit 3601 differ. The operation of the corresponding point selection unit 3601 is the same as that of Modification 12, and a description thereof is therefore omitted.
- FIG. 41 is a flowchart showing the operation of Modification 14. The operation of Modification 14 will be described in detail with reference to FIG. 41. Note that steps S4101 to S4103 and S4106 shown in FIG. 41 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- After executing the processing of steps S4101 to S4103 shown in FIG. 41, the corresponding point normalized relative scale calculation unit 601 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group. Then, the corresponding point normalized relative scale calculation unit 601 normalizes the calculated relative scale relationship based on the image size or the actual subject size. The corresponding point relative direction calculation unit 1003 calculates the corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S4104).
- The corresponding point selection unit 3601 clusters the feature points based on the corresponding point normalized relative scale information group, the corresponding point relative direction information group, the first coordinate value information group, the relative scale range, the relative direction range, and the relative coordinate value range, and selects feature points based on the clustering result (step S4105). Then, the process proceeds to step S4106 shown in FIG. 41.
- Modification 14 provides the same effect as Modification 13. Further, since only feature points whose normalized relative scale value falls within the relative scale range, whose direction information falls within the relative direction range, and whose relative coordinate value falls within the relative coordinate value range are clustered, the amount of calculation can be reduced compared with Modification 13.
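The computation saving described above comes from discarding, before clustering, any corresponding point whose relative values fall outside the predetermined ranges. Below is a minimal sketch of such a pre-filter; the data layout, names, and range values are illustrative assumptions, not taken from the specification.

```python
# Hypothetical pre-filter: keep only corresponding points whose relative
# scale, relative direction, and relative coordinate values fall within
# predetermined ranges, so that clustering runs on fewer points.

def filter_by_ranges(points, scale_range, direction_range, coord_range):
    """Keep only corresponding points whose relative values fall in all ranges."""
    lo_s, hi_s = scale_range
    lo_d, hi_d = direction_range
    (lo_x, hi_x), (lo_y, hi_y) = coord_range
    return [p for p in points
            if lo_s <= p["rel_scale"] <= hi_s
            and lo_d <= p["rel_dir"] <= hi_d
            and lo_x <= p["rel_xy"][0] <= hi_x
            and lo_y <= p["rel_xy"][1] <= hi_y]

points = [
    {"rel_scale": 1.0, "rel_dir": 5.0,   "rel_xy": (10.0, 12.0)},  # inside all ranges
    {"rel_scale": 4.0, "rel_dir": 5.0,   "rel_xy": (10.0, 12.0)},  # scale out of range
    {"rel_scale": 1.1, "rel_dir": 170.0, "rel_xy": (10.0, 12.0)},  # direction out of range
]
kept = filter_by_ranges(points, (0.5, 2.0), (-30.0, 30.0), ((0.0, 100.0), (0.0, 100.0)))
print(len(kept))  # 1
```

Only the surviving points are passed to the clustering step, which is where the reduction in calculation comes from.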
- FIG. 44 is a block diagram illustrating a functional configuration of an image processing device 440 according to Modification 15.
- The image processing device 440 has the same configuration as the image processing device 420 shown in FIG. 42, except that the configuration and operation of the corresponding point selection unit 4401 differ.
- The corresponding point selection unit 4401 clusters the feature points using the corresponding point relative scale information group and the relative scale range output from the corresponding point relative scale calculation unit 204, and the relative coordinate value information group and the relative coordinate value range output from the relative coordinate value calculation unit 4202.
- FIG. 45 is a flowchart showing the operation of Modification 15. The operation of Modification 15 will be described in detail with reference to FIG. 45. Note that steps S4501 to S4503 and S4507 shown in FIG. 45 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- the corresponding point relative scale calculation unit 204 calculates the corresponding point relative scale, and the corresponding point relative direction calculation unit 1003 calculates the corresponding point relative direction (step S4504).
- The relative coordinate value calculation unit 4202 calculates the relative coordinate values using the corresponding point relative scale information group, the corresponding point relative direction information group, the reference point coordinate value, the first coordinate value information group, and the second coordinate value information group (step S4505).
- The corresponding point selection unit 4401 clusters the feature points based on the corresponding point relative scale information group and the relative scale range, and the relative coordinate value information group and the relative coordinate value range, and selects feature points based on the clustering result (step S4506). Then, the process proceeds to step S4507 shown in FIG. 45.
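The specification does not spell out the formula used by the relative coordinate value calculation unit 4202 in step S4505. Assuming the common generalized-Hough-style construction, in which the reference point chosen in the second image is projected into the first image through each corresponding point's relative scale and relative direction, a sketch might look like this (all names and numbers are illustrative assumptions):

```python
import math

# Hypothetical sketch: estimate where the second image's reference point
# falls in the first image, for one corresponding point. Matched points on
# the same subject then vote for nearby relative coordinates.

def relative_coordinate(p1, p2, ref2, rel_scale, rel_dir_deg):
    """p1, p2: matched feature coordinates in image 1 / image 2.
    ref2: reference point coordinate in image 2 (e.g. the image centre).
    rel_scale: scale of the image-1 point divided by that of the image-2 point.
    rel_dir_deg: difference of the two points' orientations, in degrees."""
    dx, dy = ref2[0] - p2[0], ref2[1] - p2[1]      # offset in the image-2 frame
    th = math.radians(rel_dir_deg)
    # rotate the offset by the relative direction, stretch by the relative scale
    rx = rel_scale * (dx * math.cos(th) - dy * math.sin(th))
    ry = rel_scale * (dx * math.sin(th) + dy * math.cos(th))
    return (p1[0] + rx, p1[1] + ry)

# A match with identical scale and orientation simply translates the offset.
print(relative_coordinate((50.0, 50.0), (10.0, 10.0), (20.0, 10.0), 1.0, 0.0))
```

Corresponding points belonging to the same subject yield nearby relative coordinate values, which is what makes them cluster together in step S4506.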
- Modification 15 provides the same effect as the seventh embodiment. Since only feature points whose relative scale value falls within the relative scale range and whose relative coordinate value falls within the relative coordinate value range are clustered, the amount of calculation can be reduced compared with the seventh embodiment.
- FIG. 46 is a block diagram illustrating a functional configuration of the image processing device 460 according to Modification 16. As shown in FIG. 46, the image processing device 460 has the same configuration as the image processing device 420 shown in FIG. 42, except that the configuration and operation of the corresponding point normalized relative scale calculation unit 601 differ. The operation of the corresponding point normalized relative scale calculation unit 601 is the same as that of Modification 14, and a description thereof is therefore omitted.
- FIG. 47 is a flowchart showing the operation of Modification 16. The operation of Modification 16 will be described in detail with reference to FIG. 47. Note that steps S4701 to S4703 and S4707 shown in FIG. 47 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- The corresponding point normalized relative scale calculation unit 601 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group, and normalizes the calculated relative scale relationship based on the image size or the actual subject size. The corresponding point relative direction calculation unit 1003 then calculates the corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S4704).
- The relative coordinate value calculation unit 4202 calculates the relative coordinate values using the corresponding point normalized relative scale information group, the corresponding point relative direction information group, the reference point coordinate value, the first coordinate value information group, and the second coordinate value information group (step S4705).
- The corresponding point selection unit 4203 clusters the feature points based on the corresponding point normalized relative scale information group and the relative scale range, and the relative coordinate value information group and the relative coordinate value range, and selects feature points based on the clustering result (step S4706). Then, the process proceeds to step S4707 shown in FIG. 47.
- Modification 16 provides the same effect as the seventh embodiment. Further, when the relative scales between feature points that are correctly associated between the images are normalized, the normalized corresponding point relative scale is constant for all subjects in the first image. Therefore, in Modification 16, by clustering the feature points based on the normalized corresponding point relative scale values, subjects can be identified more accurately than in the seventh embodiment even when the images contain subjects of different sizes.
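A small numeric illustration of the normalization argument above: when the raw relative scale s1/s2 of a correct match is divided by the size ratio of the two images, the result no longer depends on how large the reference image happened to be rendered. The exact normalization formula is an assumption here, and the numbers are illustrative.

```python
# Hypothetical sketch of the normalization in Modification 16: remove the
# size ratio between the two images from the raw relative scale s1/s2.

def normalized_relative_scale(s1, s2, size1, size2):
    # raw relative scale, divided by the ratio of the image (or subject) sizes
    return (s1 / s2) / (size1 / size2)

# The same subject matched against two reference images of different sizes:
# the raw relative scales disagree, the normalized ones coincide.
raw_a = 8.0 / 4.0                                         # reference rendered at 400 px
raw_b = 8.0 / 8.0                                         # same reference at 800 px
norm_a = normalized_relative_scale(8.0, 4.0, 1000, 400)
norm_b = normalized_relative_scale(8.0, 8.0, 1000, 800)
print(raw_a, raw_b)    # 2.0 1.0  -- raw values disagree
print(norm_a, norm_b)  # 0.8 0.8  -- normalized values coincide
```

Because the normalized value is the same regardless of reference size, points from differently sized subjects can still fall into a common, meaningful cluster.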
- FIG. 48 is a block diagram illustrating a functional configuration of an image processing device 480 according to Modification 17.
- The image processing device 480 has the same configuration as the image processing device 460 shown in FIG. 46, except that the configuration and operation of the corresponding point selection unit 4401 differ. The operation of the corresponding point selection unit 4401 is the same as that of Modification 16, and a description thereof is therefore omitted.
- FIG. 49 is a flowchart showing the operation of Modification 17. The operation of Modification 17 will be described in detail with reference to FIG. 49. Note that steps S4901 to S4903 and S4907 shown in FIG. 49 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- The corresponding point normalized relative scale calculation unit 601 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group, and normalizes the calculated relative scale relationship based on the image size or the actual subject size. The corresponding point relative direction calculation unit 1003 then calculates the corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S4904).
- The relative coordinate value calculation unit 4202 calculates the relative coordinate values using the corresponding point normalized relative scale information group, the corresponding point relative direction information group, the reference point coordinate value, the first coordinate value information group, and the second coordinate value information group (step S4905).
- The corresponding point selection unit 4401 clusters the feature points based on the corresponding point relative scale information group and the relative scale range, and the relative coordinate value information group and the relative coordinate value range, and selects feature points based on the clustering result (step S4906). Then, the process proceeds to step S4907 shown in FIG. 49.
- FIG. 50 is a block diagram illustrating a functional configuration of an image processing apparatus 500 according to Modification 18.
- The image processing apparatus 500 has the same configuration as the image processing apparatus 340 shown in FIG. 34, except that the configurations and operations of the second local feature quantity generation unit 4201, the relative coordinate value calculation unit 4202, and the corresponding point selection unit 5001 differ.
- The configurations and operations of the second local feature quantity generation unit 4201 and the relative coordinate value calculation unit 4202 are the same as those of Modification 17, and a description thereof is therefore omitted.
- The corresponding point selection unit 5001 clusters the feature points using the relative coordinate value information group output from the relative coordinate value calculation unit 4202 and the corresponding point relative direction information group output from the corresponding point relative direction calculation unit 1003, and selects feature points based on the clustering result.
- FIG. 51 is a flowchart showing the operation of Modification 18. The operation of Modification 18 will be described in detail with reference to FIG. 51. Note that steps S5101 to S5103 and S5107 shown in FIG. 51 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- The corresponding point relative scale calculation unit 204 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group.
- the corresponding point relative direction calculation unit 1003 calculates the corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S5104).
- The relative coordinate value calculation unit 4202 calculates the relative coordinate values using the corresponding point relative scale information group, the corresponding point relative direction information group, the reference point coordinate value, the first coordinate value information group, and the second coordinate value information group (step S5105).
- The corresponding point selection unit 5001 clusters the feature points based on the corresponding point relative direction information group and the relative coordinate value information group, and selects feature points based on the clustering result (step S5106). Then, the process proceeds to step S5107 shown in FIG. 51.
- FIG. 52 is a block diagram showing a functional configuration of the image processing apparatus 520 according to Modification 19. As shown in FIG. 52, the image processing apparatus 520 has the same configuration as the image processing apparatus 500 shown in FIG. 50, except that the configuration and operation of the corresponding point selection unit 5201 differ.
- The corresponding point selection unit 5201 clusters the feature points using the relative coordinate value information group and the relative coordinate value range output from the relative coordinate value calculation unit 4202, and the corresponding point relative direction information group and the relative direction range output from the corresponding point relative direction calculation unit 1003.
- FIG. 53 is a flowchart showing the operation of Modification 19. The operation of Modification 19 will be described in detail with reference to FIG. 53. Note that steps S5301 to S5303 and S5307 shown in FIG. 53 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- The corresponding point relative scale calculation unit 204 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group.
- the corresponding point relative direction calculation unit 1003 calculates the corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S5304).
- The relative coordinate value calculation unit 4202 calculates the relative coordinate values using the corresponding point relative scale information group, the corresponding point relative direction information group, the reference point coordinate value, the first coordinate value information group, and the second coordinate value information group (step S5305).
- The corresponding point selection unit 5201 clusters the feature points based on the corresponding point relative direction information group, the relative coordinate value information group, the relative direction range, and the relative coordinate value range, and selects feature points based on the clustering result (step S5306). Then, the process proceeds to step S5307 shown in FIG. 53.
- FIG. 54 is a block diagram illustrating a functional configuration of an image processing device 540 according to Modification 20.
- The image processing device 540 has the same configuration as the image processing device 420 shown in FIG. 42, except that the configuration and operation of the corresponding point selection unit 5401 differ.
- The corresponding point selection unit 5401 clusters the feature points using the relative coordinate value information group output from the relative coordinate value calculation unit 4202, the corresponding point relative scale information group output from the corresponding point relative scale calculation unit 204, and the corresponding point relative direction information group output from the corresponding point relative direction calculation unit 1003.
- FIG. 55 is a flowchart showing the operation of Modification 20. The operation of Modification 20 will be described in detail with reference to FIG. 55. Note that steps S5501 to S5503 and S5507 shown in FIG. 55 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- The corresponding point relative scale calculation unit 204 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group.
- the corresponding point relative direction calculation unit 1003 calculates the corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S5504).
- The relative coordinate value calculation unit 4202 calculates the relative coordinate values using the corresponding point relative scale information group, the corresponding point relative direction information group, the reference point coordinate value, the first coordinate value information group, and the second coordinate value information group (step S5505).
- The corresponding point selection unit 5401 clusters the feature points based on the corresponding point relative scale information group, the corresponding point relative direction information group, and the relative coordinate value information group, and selects feature points based on the clustering result (step S5506). Then, the process proceeds to step S5507 shown in FIG. 55.
- FIG. 56 is a block diagram illustrating a functional configuration of the image processing apparatus 560 according to Modification 21. As shown in FIG. 56, the image processing apparatus 560 has the same configuration as the image processing device 540 shown in FIG. 54, except that the configuration and operation of the corresponding point selection unit 5601 differ.
- The corresponding point selection unit 5601 clusters the feature points using the relative coordinate value information group and the relative coordinate value range output by the relative coordinate value calculation unit 4202, the corresponding point relative scale information group and the relative scale range output by the corresponding point relative scale calculation unit 204, and the corresponding point relative direction information group and the relative direction range output by the corresponding point relative direction calculation unit 1003.
- FIG. 57 is a flowchart showing the operation of Modification 21. The operation of Modification 21 will be described in detail with reference to FIG. 57. Note that steps S5701 to S5703 and S5707 shown in FIG. 57 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- the corresponding point relative scale calculation unit 204 calculates a relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group.
- the corresponding point relative direction calculation unit 1003 calculates the corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S5704).
- The relative coordinate value calculation unit 4202 calculates the relative coordinate values using the corresponding point relative scale information group, the corresponding point relative direction information group, the reference point coordinate value, the first coordinate value information group, and the second coordinate value information group (step S5705).
- The corresponding point selection unit 5601 clusters the feature points based on the corresponding point relative scale information group, the corresponding point relative direction information group, the relative coordinate value information group, the relative scale range, the relative direction range, and the relative coordinate value range, and selects feature points based on the clustering result (step S5706). Then, the process proceeds to step S5707 shown in FIG. 57.
- Modification 21 provides the same effect as Modification 20. Further, since only feature points whose relative scale value falls within the relative scale range, whose direction information falls within the relative direction range, and whose relative coordinate value falls within the relative coordinate value range are clustered, the amount of calculation can be reduced compared with Modification 20.
- FIG. 58 is a block diagram illustrating a functional configuration of an image processing apparatus 580 according to Modification 22.
- The image processing device 580 has the same configuration as the image processing device 540 shown in FIG. 54, except that the configuration and operation of the corresponding point normalized relative scale calculation unit 601 differ.
- The operation of the corresponding point normalized relative scale calculation unit 601 is the same as that of Modification 17, and a description thereof is therefore omitted.
- FIG. 59 is a flowchart showing the operation of Modification 22. The operation of Modification 22 will be described in detail with reference to FIG. 59. Note that steps S5901 to S5903 and S5907 shown in FIG. 59 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- The corresponding point normalized relative scale calculation unit 601 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group, and normalizes the calculated relative scale relationship based on the image size or the actual subject size. The corresponding point relative direction calculation unit 1003 then calculates the corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S5904).
- The relative coordinate value calculation unit 4202 calculates the relative coordinate values using the corresponding point relative scale information group, the corresponding point relative direction information group, the reference point coordinate value, the first coordinate value information group, and the second coordinate value information group (step S5905).
- The corresponding point selection unit 5401 clusters the feature points based on the corresponding point relative scale information group, the corresponding point relative direction information group, and the relative coordinate value information group, and selects feature points based on the clustering result (step S5906). Then, the process proceeds to step S5907 shown in FIG. 59.
- FIG. 60 is a block diagram illustrating a functional configuration of an image processing apparatus 600 according to Modification 23.
- The image processing apparatus 600 has the same configuration as the image processing apparatus 580 shown in FIG. 58, except that the configuration and operation of the corresponding point selection unit 5601 differ.
- The operation of the corresponding point selection unit 5601 is the same as that of Modification 21, and a description thereof is therefore omitted.
- FIG. 61 is a flowchart showing the operation of Modification 23. The operation of Modification 23 will be described in detail with reference to FIG. 61. Note that steps S6101 to S6103 and S6107 shown in FIG. 61 are the same as steps S301 to S303 and S306 shown in FIG. 3.
- After executing the processing of steps S6101 to S6103 shown in FIG. 61, the corresponding point normalized relative scale calculation unit 601 calculates the relative scale relationship based on the first scale information group, the second scale information group, and the corresponding point information group, and normalizes the calculated relative scale relationship based on the image size or the actual subject size.
- the corresponding point relative direction calculation unit 1003 calculates corresponding point relative direction information based on the first direction information group, the second direction information group, and the corresponding point information group (step S6104).
- The relative coordinate value calculation unit 4202 calculates the relative coordinate values using the corresponding point relative scale information group, the corresponding point relative direction information group, the reference point coordinate value, the first coordinate value information group, and the second coordinate value information group (step S6105).
- The corresponding point selection unit 5601 clusters the feature points based on the corresponding point relative scale information group, the corresponding point relative direction information group, the relative coordinate value information group, the relative scale range, the relative direction range, and the relative coordinate value range, and selects feature points based on the clustering result (step S6106). Then, the process proceeds to step S6107 shown in FIG. 61.
- Modification 23 provides the same effect as Modification 22. Further, since only feature points whose relative scale value falls within the relative scale range, whose relative direction information falls within the relative direction range, and whose relative coordinate value falls within the relative coordinate value range are clustered, the amount of calculation can be reduced compared with Modification 22.
- Mode 2 An image processing apparatus comprising: a first local feature quantity generation unit that detects one or more first feature points from a first image and calculates, from a predetermined range including each detected first feature point, a first local feature quantity information group corresponding to each first feature point;
- a corresponding point calculation unit that calculates a correspondence relationship between the first feature point and a second feature point included in the second local feature amount information group calculated from the second image as corresponding point information;
- a corresponding point relative direction calculation unit that calculates, as corresponding point relative direction information, the relative relationship between the direction of the first feature point and the direction of the second feature point; a corresponding point selection unit that clusters at least one of the first feature point and the second feature point based on the corresponding point relative direction information and selects at least one feature point based on the clustering result;
- and a determination unit that compares the first image with the second image for each cluster based on the feature points selected by the corresponding point selection unit, and determines the identity of the subject.
- Mode 4 The image processing apparatus according to any one of Modes 1 to 3, wherein the corresponding point selection unit clusters the first feature points based on the corresponding point relative information and the coordinate values of the first feature points.
- Mode 5 The image processing apparatus according to any one of Modes 1 to 4, wherein the corresponding point selection unit generates selection information in which the corresponding point information of the selected feature points is associated with one or more pieces of cluster information.
- The image processing apparatus according to any one of Modes 1 to 5, further comprising a relative coordinate value calculation unit that selects a reference point from the second image and calculates, as relative coordinate information, the relative relationship between the first feature point and the reference point based on the first feature point, the second feature point, and the corresponding point relative information.
- The image processing apparatus according to any one of Modes 1 to 6, wherein the corresponding point selection unit clusters at least one of the first feature point and the second feature point based on the corresponding point relative information and a predetermined corresponding point relative information range.
- Mode 9 A subject identification method including: detecting one or more first feature points from a first image, and calculating, from a predetermined range including each detected first feature point, a first local feature quantity information group corresponding to each first feature point; calculating, as corresponding point information, a correspondence relationship between the first feature point and a second feature point included in a second local feature quantity information group calculated from a second image; calculating, as corresponding point relative direction information, the relative relationship between the direction of the first feature point and the direction of the second feature point based on the first local feature quantity information group, the second local feature quantity information group, and the corresponding point information; clustering at least one of the first feature point and the second feature point based on the corresponding point relative direction information, and selecting at least one feature point based on the clustering result; and comparing the first image and the second image for each cluster based on the selected feature points to determine the identity of the subject.
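The method of Mode 9 can be sketched end to end on synthetic data. The sketch below substitutes hand-made keypoints and a one-dimensional descriptor for a real detector, and uses simple angle binning as the clustering step; every name, value, and threshold is an illustrative assumption, not the specification's algorithm.

```python
from collections import defaultdict

# Minimal sketch of Mode 9 on synthetic keypoints.
# Each keypoint: (x, y, orientation_deg, descriptor).

def match(kps1, kps2):
    """Nearest-descriptor matching: returns index pairs (i, j)."""
    pairs = []
    for i, k1 in enumerate(kps1):
        j = min(range(len(kps2)),
                key=lambda j: abs(k1[3] - kps2[j][3]))  # 1-D descriptor distance
        pairs.append((i, j))
    return pairs

def cluster_by_relative_direction(pairs, kps1, kps2, bin_deg=20.0):
    """Cluster corresponding points by orientation difference (mod 360)."""
    clusters = defaultdict(list)
    for i, j in pairs:
        rel_dir = (kps1[i][2] - kps2[j][2]) % 360.0
        clusters[int(rel_dir // bin_deg)].append((i, j))
    return clusters

def identical(clusters, min_points=3):
    """Per-cluster decision: enough consistent matches -> same subject."""
    return any(len(c) >= min_points for c in clusters.values())

# Image 2 keypoints, and image 1 = the same subject rotated by ~30 degrees.
kps2 = [(10, 10, 0.0, 0.1), (20, 10, 45.0, 0.2), (15, 20, 90.0, 0.3)]
kps1 = [(x + 5, y + 5, a + 30.0, d) for (x, y, a, d) in kps2]
clusters = cluster_by_relative_direction(match(kps1, kps2), kps1, kps2)
print(identical(clusters))  # True: all matches share the ~30 degree bin
```

Correct matches on one subject share a common rotation, so they land in the same direction bin; stray matches scatter over other bins and never reach the cluster-size threshold.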
- Mode 11 A program executed by a computer that controls an image processing apparatus, the program causing the computer to execute a process of detecting one or more first feature points from a first image and calculating, from a predetermined range including each detected first feature point, a first local feature quantity information group corresponding to each first feature point.
Description
This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2013-268852 (filed December 26, 2013), the entire contents of which are incorporated herein by reference.
The present invention relates to an image processing apparatus, a subject identification method, and a program.
Further, a corresponding point calculation unit is provided that calculates, as corresponding point information, the correspondence relationship between the first feature points and second feature points included in a second local feature quantity information group calculated from a second image.
Further, a corresponding point relative scale calculation unit is provided that calculates, as corresponding point relative scale information, the relative relationship between the scale of the first feature point and the scale of the second feature point based on the first local feature quantity information group, the second local feature quantity information group, and the corresponding point information.
Further, a corresponding point selection unit is provided that clusters at least one of the first feature points and the second feature points based on the corresponding point relative scale information, and selects at least one feature point based on the clustering result.
Further, a determination unit is provided that compares the first image with the second image for each cluster based on the feature points selected by the corresponding point selection unit, and determines the identity of the subject.
The method includes a step of detecting one or more first feature points from a first image and calculating, from a region of a predetermined range including each detected first feature point, a first local feature quantity information group corresponding to each first feature point;
a step of calculating, as corresponding point information, the correspondence relationship between the first feature points and second feature points included in a second local feature quantity information group calculated from a second image;
and a step of calculating, as corresponding point relative scale information, the relative relationship between the scale of the first feature point and the scale of the second feature point based on the first local feature quantity information group, the second local feature quantity information group, and the corresponding point information.
The method further includes a step of clustering at least one of the first feature points and the second feature points based on the corresponding point relative scale information and selecting at least one feature point based on the clustering result,
and a step of comparing the first image with the second image for each cluster based on the selected feature points to determine the identity of the subject.
Note that this method is tied to a particular machine, namely an image processing apparatus that determines the identity of the subject of a first image and the subject of a second image.
The program includes a process of detecting one or more first feature points from a first image and calculating, from a region of a predetermined range including each detected first feature point, a first local feature quantity information group corresponding to each first feature point;
a process of calculating, as corresponding point information, the correspondence relationship between the first feature points and second feature points included in a second local feature quantity information group calculated from a second image;
and a process of calculating, as corresponding point relative scale information, the relative relationship between the scale of the first feature point and the scale of the second feature point based on the first local feature quantity information group, the second local feature quantity information group, and the corresponding point information.
The program further includes a process of clustering at least one of the first feature points and the second feature points based on the corresponding point relative scale information and selecting at least one feature point based on the clustering result,
and a process of comparing the first image with the second image for each cluster based on the selected feature points to determine the identity of the subject.
Note that this program can be recorded on a computer-readable storage medium. The storage medium can be a non-transient medium such as a semiconductor memory, a hard disk, a magnetic recording medium, or an optical recording medium. The present invention can also be embodied as a computer program product.
The first embodiment will be described in detail with reference to the drawings.
Embodiment 1 will be described with reference to FIG. 2. FIG. 2 is a block diagram showing the functional configuration of an image processing apparatus 20 according to Embodiment 1. As shown in FIG. 2, the image processing apparatus 20 includes a first local feature quantity generation unit 201, a second local feature quantity generation unit 202, a corresponding point calculation unit 203, a corresponding point relative scale calculation unit 204, a corresponding point selection unit 205, and a determination unit 206. Here, FIG. 2 is not intended to limit the image processing apparatus 20 to the configuration shown in FIG. 2. In the following description, it is assumed that the first image contains one or more identical or similar subjects and that the second image contains one subject. However, this is not intended to limit the first image and the second image to this condition.
第1の局所特徴量生成部201は、第1の画像から、所定の条件を満たす数の特徴点を検出し、検出した各特徴点を含む周辺領域(近傍領域)の局所特徴量を生成する。例えば、特徴点の数が所定の閾値を超えることを、上記の所定の条件としても良い。
対応点算出部203は、第1の局所特徴量生成部201が出力した第1の局所特徴量情報群と、第2の局所特徴量生成部202が出力した第2の局所特徴量群とを用いて、所定の条件を満たす数の対応点情報を生成する。例えば、対応点情報の数が所定の閾値を超えることを、上記の所定の条件としても良い。
対応点相対スケール算出部204は、対応点算出部203が出力した対応点情報群と、第1の局所特徴量生成部201が出力した第1のスケール情報群と、第2の局所特徴量生成部202が出力した第2のスケール情報群を用いて、画像間で対応付けられた特徴点(以下、対応点と言う)のスケール値の相対関係を、所定の条件を満たす数だけ算出する。そして、対応点相対スケール算出部204は、算出された相対関係で構成される対応点相対スケール情報群を、対応点選定部205に出力する。
[数2]
σ' = s'(q) - s'(p)
ここで、σ’は対応点のスケール値の差分値を、s’(q)は第1の画像から検出したq番目の特徴点のスケール値を、s’(p)は第2の画像から検出したp番目の特徴点のスケール値を示す。
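対応点相対スケール(比、および数2の差分形式)の計算は、例えば次のように書ける(概略例。対数スケール値を差分に用いる点は説明用の仮定であり、本文の式そのものを示すものではない)。

```python
import math

def relative_scale(s_q: float, s_p: float) -> float:
    """対応点相対スケール(比)。s_q は第1の画像の特徴点のスケール値、s_p は第2の画像側。"""
    return s_q / s_p

def relative_scale_diff(s_q: float, s_p: float) -> float:
    """数2に相当する差分形式。対数スケール値 s' = log s を用いると比の形式と等価になる。"""
    return math.log(s_q) - math.log(s_p)
```

同一被写体上で正しく対応付けられた点同士では、この値は被写体ごとにほぼ一定となる。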
対応点選定部205は、対応点相対スケール算出部204が出力した対応点相対スケール情報群を用いて、第1の画像に含まれる特徴点をクラスタリングする。そして、対応点選定部205は、クラスタリング結果に基づいて特徴点を選定する。具体的には、例えば、対応点選定部205は、任意のクラスタに含まれる対応点の数が所定の閾値以上ならば、当該クラスタに含まれる特徴点を選定するようにしても良い。
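対応点選定部205の処理は、例えば次のような概略で書ける(1次元のビン量子化によるクラスタリングは説明用の仮定であり、本文は具体的なクラスタリング手法を特定していない)。

```python
from collections import defaultdict

def cluster_by_relative_scale(rel_scales, bin_width=0.25):
    """対応点相対スケール値を幅 bin_width のビンに量子化してクラスタリングする概略例。
    戻り値は「ビン番号 -> 特徴点インデックスのリスト」。"""
    clusters = defaultdict(list)
    for idx, s in enumerate(rel_scales):
        clusters[round(s / bin_width)].append(idx)
    return clusters

def select_points(clusters, min_size=3):
    """対応点の数が閾値 min_size 以上のクラスタに含まれる特徴点だけを選定する。"""
    return {cid: idxs for cid, idxs in clusters.items() if len(idxs) >= min_size}
```

例えば相対スケール値 [1.0, 1.05, 0.98, 2.0, 1.02] では、1.0 付近の4点が一つのクラスタとして選定され、孤立した 2.0 の点は除外される。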
判定部206は、対応点選定部205が出力した選定情報群を用いて、クラスタ単位で、画像間で同一性または類似の被写体を判定する。例えば、選定された特徴点の数が、ある閾値を超える場合に、対象クラスタと第2の画像とが、同一(または類似)の被写体であると判定しても良い。そして、判定部206は、その判定結果を出力する。
また、判定部206は、例えば、対象クラスタに属する二つの対応点から成る対応点ペアの座標値から回転やスケール変化を推定し、対応点相対スケールと対応点の方向情報の差分値に基づいて回転やスケール変化を推定し、各々の方法で算出した回転やスケール変化を比較し、それらの類似性が十分に大きければ、対象クラスタと第2の画像とが、同一と判定しても良い。具体的には、対象クラスタから二つの対応点から成る対応点ペアを取り、対応点ペアの画像1に属する二つの特徴点を端点とする線分と、対応点ペアの画像2に属する二つの特徴点を端点とする線分とを比較し、二つの線分の回転やスケールの変化を算出し、対応点相対スケールと対応点の方向情報の差分値に基づいて算出された回転やスケール変化と比較する。比較するときは、回転とスケール変化のいずれかまたは両方を使って比較しても良い。また、対応点ペアから対応点相対スケールと対応点の方向情報の差分値に基づいて算出される回転やスケールの変化値は二つ求まるため、それぞれの値を独立に比較しても良いし、いずれかの値を使って比較しても良いし、その平均値を使って比較しても良い。また、同一判定するときは、対象クラスタに属する任意の二つの対応点から成る対応点ペアに基づいて判定しても良いし、対象クラスタからN個(N>1)の対応点ペアを上記の方法で判定し、同一と判定される対応点ペアが一定の割合以上存在した場合に、対象クラスタと第2の画像とが同一と判定しても良い。ここでNは、対象クラスタに属する全対応点ペアでも良いし、一部の対応点ペアでも良い。
また、判定部206は、例えば、対象クラスタに属する二つ以上の対応点の座標値から回転やスケール変化を推定し、対応点相対スケールと対応点の方向情報の差分値に基づいて回転やスケール変化を推定し、各々の方法で算出した回転やスケール変化を比較し、それらの類似性が十分に大きければ、対象クラスタと第2の画像とが、同一と判定しても良い。具体的には、対象クラスタからN個の対応点ペアを取り、各対応点ペアの画像1に属する二つの特徴点を端点とする線分と、画像2に属する二つの特徴点を端点とする線分とを比較し、二つの線分から回転やスケールの変化を算出し、対応点相対スケールと対応点の方向情報の差分値に基づいて算出された回転やスケール変化と比較する。比較するときは、回転とスケール変化のいずれかまたは両方を使って比較しても良い。また、N個の対応点ペアから線分の回転とスケールの変化を算出するときは、それぞれの平均値を使っても良いし、中央値でも良いし、任意の対応点ペアから算出した値でも良い。また、対応点相対スケールと対応点の方向情報の差分値に基づいて回転やスケール変化を推定するときは、N個の対応点相対スケールと対応点の方向情報の差分値の平均値でも良いし、中央値でも良いし、任意の一つの対応点から算出した値でも良い。ここでNは、対象クラスタに属する全対応点ペアでも良いし、一部の対応点ペアでも良い。
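上記の、線分に基づく回転・スケール変化の推定と、対応点相対スケール・相対方向に基づく推定との整合性判定は、例えば次のように書ける(関数名・許容閾値は説明用の仮定)。

```python
import math

def segment_transform(p1, p2, q1, q2):
    """画像1の2点 (p1, p2) を端点とする線分と、画像2の対応する線分 (q1, q2) から、
    スケール変化と回転角(ラジアン)を算出する概略例。"""
    v1 = (p2[0] - p1[0], p2[1] - p1[1])
    v2 = (q2[0] - q1[0], q2[1] - q1[1])
    scale = math.hypot(*v2) / math.hypot(*v1)
    rot = math.atan2(v2[1], v2[0]) - math.atan2(v1[1], v1[0])
    return scale, rot

def is_consistent(seg_scale, seg_rot, rel_scale, rel_rot,
                  scale_tol=0.2, rot_tol=math.radians(10)):
    """線分から推定した変化と、対応点相対スケール・相対方向から推定した変化を比較し、
    両者の類似性が十分に大きいかを判定する。回転差は (-pi, pi] に正規化して比較する。"""
    rot_diff = (seg_rot - rel_rot + math.pi) % (2 * math.pi) - math.pi
    return abs(seg_scale - rel_scale) <= scale_tol and abs(rot_diff) <= rot_tol
```

例えば、画像1の線分 (0,0)-(1,0) が画像2で (0,0)-(0,2) に対応する場合、スケール変化2倍・回転90度が推定される。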
[数3]
ここで、σijは第1の画像のi番目の特徴点のスケール値と、第2の画像のj番目の特徴点のスケール値とを含んで構成される、対応点相対スケール値を示す。ρijは、特徴点iと特徴点jの方向情報の差分値を示す。
次に、実施形態1の動作について詳細に説明する。
以上説明したように、実施形態1においては、画像処理装置10は、対応点相対スケール情報に基づいて、特徴点をクラスタリングすると共に、クラスタ単位で、画像間で同一または類似の被写体を判定する。画像間で正しく対応付けられた特徴点間の相対スケールは被写体ごとに一定となるため、対応点相対スケール情報に基づいて特徴点をクラスタリングすることにより、第1の画像の特徴点と、第2の画像の特徴点との誤った対応付けを削除することができる。これにより、画像間で同一または類似の被写体を精度良く識別することができる。
次に、第2の実施形態について、詳細に説明する。本実施形態は、特徴点のクラスタリングの際に、相対スケール範囲を限定して、特徴点をクラスタリングする形態である。以下の説明では、実施形態1と同一または類似の構成については同一の符号を振ると共に、適宜、説明を省略する。また、作用効果の記載についても、実施形態1と同様の場合には、適宜、説明を省略する。この点、実施形態3以降についても同様である。
図5は本実施形態の動作を示すフローチャートである。図5を用いて、本実施形態の動作を詳細に説明する。なお、図5に示すステップS501~ステップS503、及びステップS506は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
実施形態2は、実施形態1と同様の効果が得られるとともに、相対スケール範囲内の相対スケール情報を持つ特徴点だけをクラスタリングするため、実施形態1より計算量を低減できる。
次に、実施形態3について、詳細に説明する。本実施形態は、対応点相対スケールを正規化する形態である。
[数4]
σ正規値 = σ(n) / z
ここで、σ正規値は正規化された対応点相対スケール値を、σ(n)はn番目の対応点(例えば、第1の画像のq番目の特徴点と第2の画像のp番目の特徴点)の相対スケール値を、zは正規化値を示す。
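数4の正規化は、例えば次のように書ける(正規化値 z の具体的な選び方は本文では特定されていないため、未指定時に中央値を用いるのは説明用の仮定)。

```python
import statistics

def normalize_relative_scales(rel_scales, z=None):
    """対応点相対スケール値の列を正規化値 z で割って正規化する概略例。
    z を明示しない場合は相対スケール群の中央値を用いる(説明用の仮定)。"""
    if z is None:
        z = statistics.median(rel_scales)
    return [s / z for s in rel_scales]
```

サイズの異なる同種の被写体でも、正規化後の相対スケールは画像内で一定に揃う、というのが本実施形態の狙いである。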
図7は本実施形態の動作を示すフローチャートである。図7を用いて、本実施形態の動作を詳細に説明する。なお、図7に示すステップS701~ステップS703、及びステップS706は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
実施形態3は、実施形態1と同様の効果が得られるとともに、画像間で正しく対応付けられた特徴点間の相対スケールを正規化すると、正規化した対応点相対スケールは第1の画像内の全ての被写体で一定となる。そのため、対応点の正規化相対スケール値に基づいて特徴点をクラスタリングすることにより、例えば、ロゴなどの一部の模様が同じでサイズが異なる被写体同士の特徴点を対応付けるなど、第1の画像の特徴点と第2の画像の特徴点の誤った対応付けを削除することができる。これにより、画像間でサイズが異なる被写体が含まれていても、実施形態1よりもそれらを精度良く識別することができる。
次に、実施形態4について、詳細に説明する。本実施形態は、特徴点の相対方向関係に基づいて、クラスタリングする形態である。
第1の局所特徴量生成部1001は、第1の画像から、所定の条件を満たす数の特徴点を検出する。そして、第1の局所特徴量生成部1001は、検出した各特徴点の座標値から、特徴点を含む周辺領域(近傍領域)の局所特徴量を生成する。そして、第1の局所特徴量生成部1001は、生成した局所特徴量で構成される第1の局所特徴量群を対応点算出部203に出力する。
[数5]
ρ = θ(q) - θ(p)
ここで、ρは、対応点の方向情報の差分値を示す。θ(q)は、第1の画像から検出したq番目の特徴点の方向情報を示す。θ(p)は、第2の画像から検出したp番目の特徴点の方向情報を示す。
[数6]
ρ' = θ(q) / θ(p)
ここで、ρ’は対応点の方向情報の比率を示す。
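数5に相当する対応点相対方向の計算は、角度の周回(360度をまたぐ場合)を考慮すると、例えば次のように書ける(概略例)。

```python
import math

def relative_direction(theta_q: float, theta_p: float) -> float:
    """対応点の方向情報の差分(ラジアン)を (-pi, pi] の範囲に正規化して返す。
    theta_q は第1の画像のq番目の特徴点の方向情報、theta_p は第2の画像側。"""
    d = theta_q - theta_p
    return math.atan2(math.sin(d), math.cos(d))
```

例えば 350度と10度の差分は、周回を考慮しないと 340度になるが、本関数では -20度相当として扱われる。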
図11は本実施形態の動作を示すフローチャートである。図11を用いて、本実施形態の動作を詳細に説明する。なお、図11に示すステップS1101~ステップS1103、及びステップS1106は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
実施形態4は、対応点相対方向情報に基づいて特徴点をクラスタリングすると共に、クラスタ単位で、画像間で同一または類似の被写体を判定する。画像間で正しく対応付けられた特徴点間の相対方向は被写体ごとに一定となるため、対応点相対方向情報に基づいて特徴点をクラスタリングすることにより、第1の画像の特徴点と第2の画像の特徴点の誤った対応付けを削除することができる。これにより、画像間で同一または類似の被写体を精度良く識別することができる。
次に、実施形態5について、詳細に説明する。本実施形態は、対応点相対スケール情報、及び対応点相対方向情報をクラスタリングに用いる形態である。
第1の局所特徴量生成部1401は、第1の局所特徴量群と、第1のスケール情報群とを生成する。そして、第1の局所特徴量生成部1401は、生成した第1の局所特徴量群を、対応点算出部203に出力する。また、第1の局所特徴量生成部1401は、生成した第1のスケール情報群を、対応点相対スケール算出部204に出力する。また、第1の局所特徴量生成部1401は、第1の方向情報群を対応点相対方向算出部1003に出力する。
対応点選定部1403は、対応点相対スケール算出部204が出力した対応点相対スケール情報群と、対応点相対方向算出部1003が出力した対応点相対方向情報群とを用いて、第1の画像に含まれる特徴点をクラスタリングする。
図15は本実施形態の動作を示すフローチャートである。図15を用いて、本実施形態の動作を詳細に説明する。なお、図15に示すステップS1501~ステップS1503、及びステップS1506は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
実施形態5は、実施形態1と同様の効果が得られるとともに、画像間で正しく対応付けられた特徴点間の相対方向は被写体ごとに一定となる。そして、実施形態5は、相対方向情報に基づいて特徴点をクラスタリングすることにより、第1の画像の特徴点と第2の画像の特徴点の誤った対応付けを削除することができる。これにより、実施形態5は、実施形態1よりも認識精度を向上することができる。
次に、実施形態6について、詳細に説明する。本実施形態は、特徴点の座標値をクラスタリングに用いる形態である。
第1の局所特徴量生成部2201は、実施形態1に係る第1の局所特徴量生成部201と同様の動作により、第1の局所特徴量群を生成し、対応点算出部203に出力する。また、第1の局所特徴量生成部2201は、特徴点から生成した第1のスケール情報群を対応点相対スケール算出部204に出力する。また、第1の局所特徴量生成部2201は、第1の画像から検出した特徴点の座標値で構成される第1の座標値情報群を対応点選定部2202に出力する。
対応点選定部2202は、対応点相対スケール算出部204が出力した対応点相対スケール情報群と、第1の局所特徴量生成部2201が出力した第1の座標値情報群とを用いて、第1の画像に含まれる特徴点をクラスタリングする。そして、対応点選定部2202は、クラスタリング結果に基づいて特徴点を選定し、選定情報群を出力する。特徴点の選定、及び選定情報群の出力については、上記の実施形態と同様であるため、詳細な説明は省略する。
図23は本実施形態の動作を示すフローチャートである。図23を用いて、本実施形態の動作を詳細に説明する。なお、図23に示すステップS2301~ステップS2303、及びステップS2306は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
実施形態6は、実施形態1と同様の効果が得られるとともに、画像内の特徴点の座標値に基づいてそれらをクラスタリングし、クラスタ単位で、画像間で同一または類似の被写体を判定する。画像間で正しく対応付けられた特徴点は被写体ごとに密集するため、座標値情報に基づいて特徴点をクラスタリングすることにより、実施形態6は、同一スケールの被写体が複数存在する場合であっても、実施形態1よりもそれらを個別に認識できる。
次に、実施形態7について詳細に説明する。実施形態7は、特徴点の相対座標値を算出し、クラスタリングに用いる形態である。
第1の局所特徴量生成部3401は、第1の局所特徴量群と第1の方向情報群と第1の座標値情報群とを生成し、それぞれ、対応点算出部203と対応点相対方向算出部1003と対応点選定部4203とに出力する。また、第1の局所特徴量生成部3401は、図2に示す第1の局所特徴量生成部201と同様の動作により生成した第1のスケール情報群を対応点相対スケール算出部204に出力する。
相対座標値算出部4202は、対応点相対スケール算出部204が出力した対応点相対スケール情報群と、対応点相対方向算出部1003が出力した対応点相対方向情報群と、第1の局所特徴量生成部3401が出力した第1の座標値情報群と、第2の局所特徴量生成部4201が出力した第2の座標値情報群と、基準点座標値とを用いて、第1の画像の特徴点の座標値を任意の座標系の点に変換する。そして、相対座標値算出部4202は、変換された座標値(以下、対応点相対座標値と呼ぶ)を、対応点選定部4203に出力する。ここで、基準点とは、あらかじめ定められた座標値であり、例えば、第2の画像と同じデカルト座標系内の任意の点でも良い。以下では、被写体中心点を基準点として説明する。
[数7]
c_ij = v_i + σ_ij・R(ρ_ij)・u'_j (R(ρ)は角度ρの回転行列)
ここで、i及びjは、それぞれ、第1の画像及び第2の画像の特徴点番号を、viは第1の画像のi番目の特徴点の座標値を、σijは対応点相対スケールを、ρijは対応点相対方向を、cijは第1の画像中の被写体中心の座標値を、uj’は第2の画像のj番目の特徴点から第2の画像の被写体中心点へのベクトルを示している。ベクトルは、例えば、以下の式、数8のように算出してもよい。
[数8]
u'_j = (x_c - x_j, y_c - y_j)^T
ここで、xjはj番目の特徴点のx座標値を、yjはj番目の特徴点のy座標値を、xcは選定された、第2の画像の基準点のx座標値を、ycは第2の画像の基準点のy座標値を示す。
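数7・数8に相当する被写体中心への変換は、例えば次のように書ける(式の具体形は本文の記述から再構成した仮定であり、関数名は説明用のもの)。

```python
import math

def center_vector(feature_xy, center_xy):
    """数8: 第2の画像のj番目の特徴点から基準点(被写体中心点)へのベクトル u'_j。"""
    return (center_xy[0] - feature_xy[0], center_xy[1] - feature_xy[1])

def to_center_vote(v_i, u_prime, sigma, rho):
    """数7に相当する計算の概略例: 第1の画像の特徴点 v_i を、対応点相対スケール sigma と
    対応点相対方向 rho でベクトル u' を変換し、被写体中心の推定座標 c に写す。"""
    cos_r, sin_r = math.cos(rho), math.sin(rho)
    dx = sigma * (cos_r * u_prime[0] - sin_r * u_prime[1])
    dy = sigma * (sin_r * u_prime[0] + cos_r * u_prime[1])
    return (v_i[0] + dx, v_i[1] + dy)
```

同一被写体上の各特徴点からの推定中心座標は一点付近に集まるため、この座標値をクラスタリングに用いることができる。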
対応点選定部4203は、相対座標値算出部4202が出力した相対座標値情報群と、対応点相対スケール算出部204が出力した対応点相対スケール情報群とを用いて、第1の画像に含まれる特徴点をクラスタリングする。そして、対応点選定部4203は、クラスタリング結果に基づいて特徴点を選定し、選定情報群を出力する。特徴点の選定、及び選定情報群の出力については、上記の実施形態と同様であるため、詳細な説明は省略する。
この場合、対応点相対座標値の算出には、例えば、以下の式、数9を用いても良い。
[数9]
ここで、vjjは相対移動量を、vjは第2の画像のj番目の特徴点の座標値を示している。
図43は本実施形態の動作を示すフローチャートである。図43を用いて、本実施形態の動作を詳細に説明する。なお、図43に示すステップS4301~ステップS4303、及びステップS4306は、上記のステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
実施形態7は、実施形態1と同様の効果が得られるとともに、第1の画像の各特徴点を被写体中心に集めてから、それらをクラスタリングするため、実施形態1よりも特徴点を精度良くクラスタリングできる。従って、実施形態7は、画像内の同一または類似の被写体を実施形態1よりも精度良く識別できる。
図8を参照して、変形例1を説明する。図8は、変形例1に係る画像処理装置80の機能構成を示すブロック図である。図8に示すように、画像処理装置80は、図6に示す画像処理装置60と同様の構成であり、対応点選定部401の構成と動作が異なる。また、対応点選定部401は実施形態2と同様の構成と動作であり、ここでの説明を省略する。
図9は変形例1の動作を示すフローチャートである。図9を用いて、変形例1の動作を詳細に説明する。なお、図9に示すステップS901~ステップS903、及びステップS906は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例1は、実施形態3と同様の効果が得られるとともに、相対スケール範囲内の正規化相対スケール情報を持つ特徴点だけをクラスタリングするため、実施形態3より計算量を低減できる。
次に、図12を参照して、変形例2を説明する。図12は、変形例2に係る画像処理装置120の機能構成を示すブロック図である。図12に示すように、画像処理装置120は、図10に示す画像処理装置100と同様の構成であり、対応点選定部1201の構成と動作が異なる。
図13は変形例2の動作を示すフローチャートである。図13を用いて、変形例2の動作を詳細に説明する。なお、図13に示すステップS1301~ステップS1303、及びステップS1306は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例2は、実施形態4と同様の効果が得られるとともに、相対方向範囲内の方向情報を持つ特徴点だけをクラスタリングするため、実施形態4より計算量を低減できる。
次に、図16を参照して、変形例3を説明する。図16は、変形例3に係る画像処理装置160の機能構成を示すブロック図である。図16に示すように、画像処理装置160は、図14に示す画像処理装置140と同様の構成であり、対応点選定部1601の構成と動作が異なる。
図17は変形例3の動作を示すフローチャートである。図17を用いて、変形例3の動作を詳細に説明する。なお、図17に示すステップS1701~ステップS1703、及びステップS1706は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例3は、実施形態5と同様の効果が得られるとともに、相対スケール範囲内の対応点相対スケール情報と相対方向範囲内の対応点相対方向情報を持つ特徴点だけをクラスタリングするため、実施形態5より計算量を低減できる。
次に、図18を参照して、変形例4を説明する。図18は、変形例4に係る画像処理装置180の機能構成を示すブロック図である。図18に示すように、画像処理装置180は、図14に示す画像処理装置140と同様の構成であり、対応点正規化相対スケール算出部601の構成と動作が異なる。また、対応点正規化相対スケール算出部601の動作は、変形例1と同様であるため説明を省略する。
図19は変形例4の動作を示すフローチャートである。図19を用いて、変形例4の動作を詳細に説明する。なお、図19に示すステップS1901~ステップS1903、及びステップS1906は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例4は、実施形態5と同様の効果が得られる。また、画像間で正しく対応付けられた特徴点間の相対スケールを正規化すると、正規化した対応点相対スケールは第1の画像内の全ての被写体で一定となるため、実施形態5よりもそれらを精度良く識別することができる。
次に、図20を参照して、変形例5を説明する。図20は、変形例5に係る画像処理装置200の機能構成を示すブロック図である。図20に示すように、画像処理装置200は、図18に示す画像処理装置180と同様の構成であり、対応点選定部1601の構成と動作が異なる。また、対応点選定部1601の動作は、変形例3と同様であるため説明を省略する。
変形例5は、変形例4と同様の効果が得られるとともに、相対スケール範囲内の正規化相対スケール値を持つ特徴点と、相対方向範囲内の方向情報を持つ特徴点とだけをクラスタリングするため、変形例4より計算量を低減できる。
次に、図24を参照して、変形例6を説明する。図24は、変形例6に係る画像処理装置240の機能構成を示すブロック図である。図24に示すように、画像処理装置240は、図22に示す画像処理装置220と同様の構成であり、対応点選定部2401の構成と動作が異なる。
図25は変形例6の動作を示すフローチャートである。図25を用いて、変形例6の動作を詳細に説明する。なお、図25に示すステップS2501~ステップS2503、及びステップS2506は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例6は、実施形態6と同様の効果が得られるとともに、相対スケール範囲内の相対スケール値を持つ特徴点と、相対座標値範囲内の座標値を持つ特徴点とだけをクラスタリングするため、実施形態6より計算量を低減できる。
次に、図26を参照して、変形例7を説明する。図26は、変形例7に係る画像処理装置260の機能構成を示すブロック図である。図26に示すように、画像処理装置260は、図22に示す画像処理装置220と同様の構成であり、対応点正規化相対スケール算出部601の構成と動作が異なる。また、対応点正規化相対スケール算出部601の動作は、変形例1と同様であるため説明を省略する。
図27は変形例7の動作を示すフローチャートである。図27を用いて、変形例7の動作を詳細に説明する。なお、図27に示すステップS2701~ステップS2703、及びステップS2706は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例7は、実施形態6と同様の効果が得られるとともに、画像間で正しく対応付けられた特徴点間の相対スケールを正規化すると、正規化した対応点相対スケールは第1の画像内の全ての被写体で一定となるため、実施形態6よりもそれらを精度良く識別することができる。
次に、図28を参照して、変形例8を説明する。図28は、変形例8に係る画像処理装置280の機能構成を示すブロック図である。図28に示すように、画像処理装置280は、図26に示す画像処理装置260と同様の構成であり、対応点選定部2401の構成と動作が異なる。また、対応点選定部2401の動作は、変形例6と同様であるため説明を省略する。
図29は変形例8の動作を示すフローチャートである。図29を用いて、変形例8の動作を詳細に説明する。なお、図29に示すステップS2901~ステップS2903、及びステップS2906は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例8は、変形例7と同様の効果が得られるとともに、正規化相対スケール範囲内の相対スケール値を持つ特徴点と、特徴点座標範囲内の座標情報を持つ特徴点とだけをクラスタリングするため、変形例7より計算量を低減できる。
次に、図30を参照して、変形例9を説明する。図30は、変形例9に係る画像処理装置300の機能構成を示すブロック図である。図30に示すように、画像処理装置300は、図10に示す画像処理装置100と同様の構成であり、第1の局所特徴量生成部3001、及び対応点選定部3002の構成と動作が異なる。
図31は変形例9の動作を示すフローチャートである。図31を用いて、変形例9の動作を詳細に説明する。なお、図31に示すステップS3101~ステップS3103、及びステップS3106は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例9は、実施形態4と同様の効果が得られるとともに、画像間で正しく対応付けられた特徴点は被写体ごとに密集するため、座標値に基づいて特徴点をクラスタリングすることにより、相対方向が同じ被写体が複数存在する場合でも実施形態4よりもそれらを個別に認識できる。
次に、図32を参照して、変形例10を説明する。図32は、変形例10に係る画像処理装置320の機能構成を示すブロック図である。図32に示すように、画像処理装置320は、図30に示す画像処理装置300と同様の構成であり、対応点選定部3201の構成と動作が異なるため、対応点選定部3201について説明する。
図33は変形例10の動作を示すフローチャートである。図33を用いて、変形例10の動作を詳細に説明する。なお、図33に示すステップS3301~ステップS3303、及びステップS3306は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例10は、変形例9と同様の効果が得られるとともに、相対方向範囲内の相対方向情報を持つ特徴点と、相対座標値範囲内の座標値を持つ特徴点とだけをクラスタリングするため、変形例9より計算量を低減できる。
次に、図34を参照して、変形例11を説明する。図34は、変形例11に係る画像処理装置340の機能構成を示すブロック図である。図34に示すように、画像処理装置340は、図14に示す画像処理装置140と同様の構成であり、第1の局所特徴量生成部3401、及び対応点選定部3402の構成と動作が異なる。
図35は変形例11の動作を示すフローチャートである。図35を用いて、変形例11の動作を詳細に説明する。なお、図35に示すステップS3501~ステップS3503、及びステップS3506は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例11は、実施形態5と同様の効果が得られるとともに、画像内の特徴点の座標値に基づいてそれらをクラスタリングし、クラスタ単位で、画像間で同一または類似の被写体を判定する。画像間で正しく対応付けられた特徴点は被写体ごとに密集するため、座標値情報に基づいて特徴点をクラスタリングすることにより、変形例11は、同一スケールまたは相対方向が同じ被写体が複数存在する場合でも実施形態5よりもそれらを個別に認識できる。
次に、図36を参照して、変形例12を説明する。図36は、変形例12に係る画像処理装置360の機能構成を示すブロック図である。図36に示すように、画像処理装置360は、図34に示す画像処理装置340と同様の構成であり、対応点選定部3601の構成と動作が異なる。
図37は変形例12の動作を示すフローチャートである。図37を用いて、変形例12の動作を詳細に説明する。なお、図37に示すステップS3701~ステップS3703、及びステップS3706は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例12は、変形例11と同様の効果が得られるとともに、相対スケール範囲内の相対スケール情報を持つ特徴点と、相対方向範囲内の相対方向情報を持つ特徴点と、相対座標値範囲内の座標値情報を持つ特徴点だけをクラスタリングするため、変形例11より計算量を低減できる。
次に、図38を参照して、変形例13を説明する。図38は、変形例13に係る画像処理装置380の機能構成を示すブロック図である。図38に示すように、画像処理装置380は、図34に示す画像処理装置340と同様の構成であり、対応点正規化相対スケール算出部601の構成と動作が異なる。また、対応点正規化相対スケール算出部601の動作は、変形例1と同様であるため説明を省略する。
図39は変形例13の動作を示すフローチャートである。図39を用いて、変形例13の動作を詳細に説明する。なお、図39に示すステップS3901~ステップS3903、及びステップS3906は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例13は、変形例11と同様の効果が得られる。また、画像間で正しく対応付けられた特徴点間の相対スケールを正規化すると、正規化した対応点相対スケールは第1の画像内の全ての被写体で一定となるため、変形例11よりもそれらを精度良く識別することができる。
次に、図40を参照して、変形例14を説明する。図40は、変形例14に係る画像処理装置400の機能構成を示すブロック図である。図40に示すように、画像処理装置400は、図38に示す画像処理装置380と同様の構成であり、対応点選定部3601の構成と動作が異なる。また、対応点選定部3601の動作は、変形例12と同様であるため説明を省略する。
図41は変形例14の動作を示すフローチャートである。図41を用いて、変形例14の動作を詳細に説明する。なお、図41に示すステップS4101~ステップS4103、及びステップS4106は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例14は、変形例13と同様の効果が得られるとともに、相対スケール範囲内の正規化相対スケール値を持つ特徴点と、相対方向範囲内の方向情報を持つ特徴点と、相対座標値範囲内にある特徴点とだけをクラスタリングするため、変形例13より計算量を低減できる。
次に、図44を参照して、変形例15を説明する。図44は、変形例15に係る画像処理装置440の機能構成を示すブロック図である。図44に示すように、画像処理装置440は、図42に示す画像処理装置420と同様の構成であり、対応点選定部4401の構成と動作が異なる。
図45は変形例15の動作を示すフローチャートである。図45を用いて、変形例15の動作を詳細に説明する。なお、図45に示すステップS4501~ステップS4503、及びステップS4507は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
次に、図46を参照して、変形例16を説明する。図46は、変形例16に係る画像処理装置460の機能構成を示すブロック図である。図46に示すように、画像処理装置460は、図42に示す画像処理装置420と同様の構成であり、対応点正規化相対スケール算出部601の構成と動作が異なる。また、対応点正規化相対スケール算出部601の動作は、変形例14と同様であるため説明を省略する。
図47は変形例16の動作を示すフローチャートである。図47を用いて、変形例16の動作を詳細に説明する。なお、図47に示すステップS4701~ステップS4703、及びステップS4707は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例16は、実施形態7と同様の効果が得られる。また、画像間で正しく対応付けられた特徴点間の相対スケールを正規化すると、正規化した対応点相対スケールは第1の画像内の全ての被写体で一定となる。そのため、変形例16は、対応点の正規化相対スケール値に基づいて特徴点をクラスタリングすることにより、画像間でサイズが異なる被写体が含まれていても、実施形態7よりもそれらを精度良く識別することができる。
次に、図48を参照して、変形例17を説明する。図48は、変形例17に係る画像処理装置480の機能構成を示すブロック図である。図48に示すように、画像処理装置480は、図46に示す画像処理装置460と同様の構成であり、対応点選定部4401の構成と動作が異なる。また、対応点選定部4401は、変形例16と同様であるため説明を省略する。
図49は変形例17の動作を示すフローチャートである。図49を用いて、変形例17の動作を詳細に説明する。なお、図49に示すステップS4901~ステップS4903、及びステップS4907は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例17は、変形例15と同様の効果が得られるとともに、正規化した対応点相対スケールは第1の画像内の全ての被写体で一定となるため、対応点の正規化相対スケール値に基づいて特徴点をクラスタリングすることにより、変形例16と同様の効果が得られる。また、変形例17は、相対スケール範囲内の相対スケール値を持つ特徴点と、相対座標値範囲内の相対座標値を持つ特徴点とだけをクラスタリングするため、変形例16よりも計算量を低減できる。
次に、図50を参照して、変形例18を説明する。図50は、変形例18に係る画像処理装置500の機能構成を示すブロック図である。図50に示すように、画像処理装置500は、図34に示す画像処理装置340と同様の構成であり、第2の局所特徴量生成部4201、相対座標値算出部4202、及び対応点選定部5001の構成と動作が異なる。第2の局所特徴量生成部4201と相対座標値算出部4202の構成と動作は、変形例17と同様の構成であるため説明を省略する。
図51は変形例18の動作を示すフローチャートである。図51を用いて、変形例18の動作を詳細に説明する。なお、図51に示すステップS5101~ステップS5103、及びステップS5107は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例18は、変形例11と同様の効果が得られるとともに、第1の画像の各特徴点を被写体中心に集めてから、それらをクラスタリングするため、変形例11よりも特徴点を精度良くクラスタリングできる。従って、画像内の同一または類似の被写体を変形例11よりも精度良く識別できる。
次に、図52を参照して、変形例19を説明する。図52は、変形例19に示す画像処理装置520の機能構成を示すブロック図である。図52に示すように、画像処理装置520は、図50に示す画像処理装置500と同様の構成であり、対応点選定部5201の構成と動作が異なる。
図53は変形例19の動作を示すフローチャートである。図53を用いて、変形例19の動作を詳細に説明する。なお、図53に示すステップS5301~ステップS5303、及びステップS5307は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例19は、変形例18と同様の効果が得られるとともに、相対方向範囲内の相対方向値を持つ特徴点と、相対座標値範囲内の相対座標値を持つ特徴点とだけをクラスタリングするため、変形例18より計算量を低減できる。
次に、図54を参照して、変形例20を説明する。図54は、変形例20に係る画像処理装置540の機能構成を示すブロック図である。図54に示すように、画像処理装置540は、図42に示す画像処理装置420と同様の構成であり、対応点選定部5401の構成と動作が異なる。
図55は変形例20の動作を示すフローチャートである。図55を用いて、変形例20の動作を詳細に説明する。なお、図55に示すステップS5501~ステップS5503、及びステップS5507は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例20は、実施形態7と同様の効果が得られるとともに、画像間で正しく対応付けられた特徴点間の相対方向は被写体ごとに一定となる。そのため、対応点相対方向情報に基づいて特徴点をクラスタリングすることにより、第1の画像の特徴点と第2の画像の特徴点の誤った対応付けを削除することができる。これにより、変形例20は、実施形態7よりも画像内の同一または類似の被写体を精度良く識別できる。
次に、図56を参照して、変形例21を説明する。図56は、変形例21に係る画像処理装置560の機能構成を示すブロック図である。図56に示すように、画像処理装置560は、図54に示す画像処理装置540と同様の構成であり、対応点選定部5601の構成と動作が異なる。
図57は変形例21の動作を示すフローチャートである。図57を用いて、変形例21の動作を詳細に説明する。なお、図57に示すステップS5701~ステップS5703、及びステップS5707は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例21は、変形例20と同様の効果が得られるとともに、相対スケール範囲内の相対スケール値を持つ特徴点と、相対方向範囲内の方向情報を持つ特徴点と、相対座標値範囲内の相対座標値を持つ特徴点とだけをクラスタリングするため、変形例20より計算量を低減できる。
次に、図58を参照して、変形例22を説明する。図58は、変形例22に係る画像処理装置580の機能構成を示すブロック図である。図58に示すように、画像処理装置580は、図54に示す画像処理装置540と同様の構成であり、対応点正規化相対スケール算出部601の構成と動作が異なる。また、対応点正規化相対スケール算出部601の動作は、変形例17と同様であるため説明を省略する。
図59は変形例22の動作を示すフローチャートである。図59を用いて、変形例22の動作を詳細に説明する。なお、図59に示すステップS5901~ステップS5903、及びステップS5907は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例22は、変形例20と同様の効果が得られるとともに、画像間で正しく対応付けられた特徴点間の相対スケールを正規化すると、正規化した対応点相対スケールは第1の画像内の全ての被写体で一定となる。そのため、変形例22は、対応点の正規化相対スケール値に基づいて特徴点をクラスタリングすることにより、画像間でサイズが異なる被写体が含まれていても、変形例20よりもそれらを精度良く識別することができる。
次に、図60を参照して、変形例23を説明する。図60は、変形例23に係る画像処理装置600の機能構成を示すブロック図である。図60に示すように、画像処理装置600は、図58に示す画像処理装置580と同様の構成であり、対応点選定部5601の構成と動作が異なる。また、対応点選定部5601の動作は、変形例21と同様であるため説明を省略する。
図61は変形例23の動作を示すフローチャートである。図61を用いて、変形例23の動作を詳細に説明する。なお、図61に示すステップS6101~ステップS6103、及びステップS6107は、図3に示すステップS301~ステップS303、及びステップS306と同一であるため、詳細な説明は省略する。
変形例23は、変形例22と同様の効果が得られるとともに、相対スケール範囲内の相対スケール値を持つ特徴点と、相対方向範囲内の相対方向情報を持つ特徴点と、相対座標値範囲内の相対座標値を持つ特徴点とだけをクラスタリングするため、変形例22より計算量を低減できる。
前記第1の特徴点と、第2の画像から算出した第2の局所特徴量情報群に含まれる第2の特徴点との対応関係を、対応点情報として算出する対応点算出部と、
前記第1の局所特徴量情報群と、前記第2の局所特徴量情報群と、前記対応点情報とに基づいて、前記第1の特徴点の方向と、前記第2の特徴点の方向との相対関係を、対応点相対方向情報として算出する対応点相対方向算出部と、
前記対応点相対方向情報に基づいて、前記第1の特徴点、前記第2の特徴点の少なくとも一の特徴点をクラスタリングし、クラスタリング結果に基づいて、少なくとも一の特徴点を選定する対応点選定部と、
前記対応点選定部が選定した特徴点に基づいて、クラスタ毎に、前記第1の画像と、前記第2の画像を比較し、被写体の同一性を判定する判定部と、
を備える画像処理装置。
前記第1の特徴点と、第2の画像から算出した第2の局所特徴量情報群に含まれる第2の特徴点との対応関係を、対応点情報として算出する工程と、
前記第1の局所特徴量情報群と、前記第2の局所特徴量情報群と、前記対応点情報とに基づいて、前記第1の特徴点の方向と、前記第2の特徴点の方向との相対関係を、対応点相対方向情報として算出する工程と、
前記対応点相対方向情報に基づいて、前記第1の特徴点、前記第2の特徴点の少なくとも一の特徴点をクラスタリングし、クラスタリング結果に基づいて、少なくとも一の特徴点を選定する工程と、
選定された特徴点に基づいて、クラスタ毎に、前記第1の画像と、前記第2の画像を比較し、被写体の同一性を判定する工程と、
を含む被写体識別方法。
第1の画像から、1又は2以上の第1の特徴点を検出し、検出した前記各第1の特徴点を含む所定の範囲の領域から、前記各第1の特徴点に対応する、第1の局所特徴量情報群を算出する処理と、
前記第1の特徴点と、第2の画像から算出した第2の局所特徴量情報群に含まれる第2の特徴点との対応関係を、対応点情報として算出する処理と、
前記第1の局所特徴量情報群と、前記第2の局所特徴量情報群と、前記対応点情報とに基づいて、前記第1の特徴点の方向と、前記第2の特徴点の方向との相対関係を、対応点相対方向情報として算出する処理と、
前記対応点相対方向情報に基づいて、前記第1の特徴点、前記第2の特徴点の少なくとも一の特徴点をクラスタリングし、クラスタリング結果に基づいて、少なくとも一の特徴点を選定する処理と、
選定された特徴点に基づいて、クラスタ毎に、前記第1の画像と、前記第2の画像を比較し、被写体の同一性を判定する処理と、
を画像処理装置を制御するコンピュータに実行させるプログラム。
11 第1の局所特徴量生成部
13 対応点算出部
14 対応点相対スケール算出部
15 対応点選定部
16 判定部
101、201、1001、1401、2201、3001、3401 第1の局所特徴量生成部
102、202、1002、1402、4201 第2の局所特徴量生成部
103、203 対応点算出部
104、206 判定部
204 対応点相対スケール算出部
205、401、1004、1201、1403、1601、2202、2401、3002、3201、3402、3601、4203、4401、5001、5201、5401、5601 対応点選定部
601 対応点正規化相対スケール算出部
1003 対応点相対方向算出部
4202 相対座標値算出部
10001、10002、10011、10012 画像
Claims (10)
- 第1の画像から、1又は2以上の第1の特徴点を検出し、検出した前記各第1の特徴点を含む所定の範囲の領域から、前記各第1の特徴点に対応する、第1の局所特徴量情報群を算出する、第1の局所特徴量生成部と、
前記第1の特徴点と、第2の画像から算出した第2の局所特徴量情報群に含まれる第2の特徴点との対応関係を、対応点情報として算出する対応点算出部と、
前記第1の局所特徴量情報群と、前記第2の局所特徴量情報群と、前記対応点情報とに基づいて、前記第1の特徴点のスケールと、前記第2の特徴点のスケールとの相対関係を、対応点相対スケール情報として算出する対応点相対スケール算出部と、
前記対応点相対スケール情報に基づいて、前記第1の特徴点、前記第2の特徴点の少なくとも一の特徴点をクラスタリングし、クラスタリング結果に基づいて、少なくとも一の特徴点を選定する対応点選定部と、
前記対応点選定部が選定した特徴点に基づいて、クラスタ毎に、前記第1の画像と、前記第2の画像を比較し、被写体の同一性を判定する判定部と、
を備える画像処理装置。 - 第1の画像から、1又は2以上の第1の特徴点を検出し、検出した前記各第1の特徴点を含む所定の範囲の領域から、前記各第1の特徴点に対応する、第1の局所特徴量情報群を算出する、第1の局所特徴量生成部と、
前記第1の特徴点と、第2の画像から算出した第2の局所特徴量情報群に含まれる第2の特徴点との対応関係を、対応点情報として算出する対応点算出部と、
前記第1の局所特徴量情報群と、前記第2の局所特徴量情報群と、前記対応点情報とに基づいて、前記第1の特徴点の方向と、前記第2の特徴点の方向との相対関係を、対応点相対方向情報として算出する対応点相対方向算出部と、
前記対応点相対方向情報に基づいて、前記第1の特徴点、前記第2の特徴点の少なくとも一の特徴点をクラスタリングし、クラスタリング結果に基づいて、少なくとも一の特徴点を選定する対応点選定部と、
前記対応点選定部が選定した特徴点に基づいて、クラスタ毎に、前記第1の画像と、前記第2の画像を比較し、被写体の同一性を判定する判定部と、
を備える画像処理装置。 - 前記対応点相対情報算出部は、前記対応点相対スケール情報を正規化する、請求項1に記載の画像処理装置。
- 前記対応点選定部は、前記対応点相対情報、及び前記第1の特徴点の座標値に基づいて、前記第1の特徴点をクラスタリングする、請求項1乃至3のいずれか一に記載の画像処理装置。
- 前記対応点選定部は、選定した特徴点の前記対応点情報と、1又は2以上のクラスタ情報と、を対応付けた選定情報を生成する、請求項1乃至4のいずれか一に記載の画像処理装置。
- 前記第2の画像から基準点を選択し、前記第1の特徴点と、前記第2の特徴点と、前記対応点相対情報とに基づいて、前記第1の特徴点と、前記基準点との相対関係を、相対座標情報として算出する、相対座標値算出部をさらに備える請求項1乃至5のいずれか一に記載の画像処理装置。
- 前記対応点選定部は、前記対応点相対情報と、所定の対応点相対情報範囲とに基づいて、前記第1の特徴点、前記第2の特徴点の少なくとも一の特徴点をクラスタリングする、請求項1乃至6のいずれか一に記載の画像処理装置。
- 第1の画像から、1又は2以上の第1の特徴点を検出し、検出した前記各第1の特徴点を含む所定の範囲の領域から、前記各第1の特徴点に対応する、第1の局所特徴量情報群を算出する工程と、
前記第1の特徴点と、第2の画像から算出した第2の局所特徴量情報群に含まれる第2の特徴点との対応関係を、対応点情報として算出する工程と、
前記第1の局所特徴量情報群と、前記第2の局所特徴量情報群と、前記対応点情報とに基づいて、前記第1の特徴点のスケールと、前記第2の特徴点のスケールとの相対関係を、対応点相対スケール情報として算出する工程と、
前記対応点相対スケール情報に基づいて、前記第1の特徴点、前記第2の特徴点の少なくとも一の特徴点をクラスタリングし、クラスタリング結果に基づいて、少なくとも一の特徴点を選定する工程と、
選定された特徴点に基づいて、クラスタ毎に、前記第1の画像と、前記第2の画像を比較し、被写体の同一性を判定する工程と、
を含む被写体識別方法。 - 第1の画像から、1又は2以上の第1の特徴点を検出し、検出した前記各第1の特徴点を含む所定の範囲の領域から、前記各第1の特徴点に対応する、第1の局所特徴量情報群を算出する工程と、
前記第1の特徴点と、第2の画像から算出した第2の局所特徴量情報群に含まれる第2の特徴点との対応関係を、対応点情報として算出する工程と、
前記第1の局所特徴量情報群と、前記第2の局所特徴量情報群と、前記対応点情報とに基づいて、前記第1の特徴点の方向と、前記第2の特徴点の方向との相対関係を、対応点相対方向情報として算出する工程と、
前記対応点相対方向情報に基づいて、前記第1の特徴点、前記第2の特徴点の少なくとも一の特徴点をクラスタリングし、クラスタリング結果に基づいて、少なくとも一の特徴点を選定する工程と、
選定された特徴点に基づいて、クラスタ毎に、前記第1の画像と、前記第2の画像を比較し、被写体の同一性を判定する工程と、
を含む被写体識別方法。 - 第1の画像から、1又は2以上の第1の特徴点を検出し、検出した前記各第1の特徴点を含む所定の範囲の領域から、前記各第1の特徴点に対応する、第1の局所特徴量情報群を算出する処理と、
前記第1の特徴点と、第2の画像から算出した第2の局所特徴量情報群に含まれる第2の特徴点との対応関係を、対応点情報として算出する処理と、
前記第1の局所特徴量情報群と、前記第2の局所特徴量情報群と、前記対応点情報とに基づいて、前記第1の特徴点のスケールと、前記第2の特徴点のスケールとの相対関係を、対応点相対スケール情報として算出する処理と、
前記対応点相対スケール情報に基づいて、前記第1の特徴点、前記第2の特徴点の少なくとも一の特徴点をクラスタリングし、クラスタリング結果に基づいて、少なくとも一の特徴点を選定する処理と、
選定された特徴点に基づいて、クラスタ毎に、前記第1の画像と、前記第2の画像を比較し、被写体の同一性を判定する処理と、
を画像処理装置を制御するコンピュータに実行させるプログラム。
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/100,694 US9798955B2 (en) | 2013-12-26 | 2014-12-25 | Image processing apparatus, photographic subject identifying method and program |
EP14874117.6A EP3089108B1 (en) | 2013-12-26 | 2014-12-25 | Image processing device, subject identification method and program |
JP2015554986A JP6459981B2 (ja) | 2013-12-26 | 2014-12-25 | 画像処理装置、被写体識別方法及びプログラム |
CN201480071107.3A CN105849776A (zh) | 2013-12-26 | 2014-12-25 | 图像处理装置、主题识别方法和程序 |
KR1020167020030A KR101822317B1 (ko) | 2013-12-26 | 2014-12-25 | 화상 처리 장치, 피사체 식별 방법 및 프로그램 |
HK16112233.4A HK1224069A1 (zh) | 2013-12-26 | 2016-10-25 | 圖像處理裝置、主題識別方法和程式 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-268852 | 2013-12-26 | ||
JP2013268852 | 2013-12-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015099016A1 true WO2015099016A1 (ja) | 2015-07-02 |
Family
ID=53478863
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/084254 WO2015099016A1 (ja) | 2013-12-26 | 2014-12-25 | 画像処理装置、被写体識別方法及びプログラム |
Country Status (7)
Country | Link |
---|---|
US (1) | US9798955B2 (ja) |
EP (1) | EP3089108B1 (ja) |
JP (1) | JP6459981B2 (ja) |
KR (1) | KR101822317B1 (ja) |
CN (1) | CN105849776A (ja) |
HK (1) | HK1224069A1 (ja) |
WO (1) | WO2015099016A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017179728A1 (ja) * | 2016-04-14 | 2017-10-19 | シャープ株式会社 | 画像認識装置、画像認識方法および画像認識プログラム |
JPWO2017006852A1 (ja) * | 2015-07-06 | 2017-12-28 | 日本電信電話株式会社 | 画像照合装置、画像照合方法、及びプログラム |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6593327B2 (ja) * | 2014-05-07 | 2019-10-23 | 日本電気株式会社 | 画像処理装置、画像処理方法およびコンピュータ可読記録媒体 |
KR101647691B1 (ko) * | 2016-02-12 | 2016-08-16 | 데이터킹주식회사 | 하이브리드 기반의 영상 클러스터링 방법 및 이를 운용하는 서버 |
CN110941989A (zh) * | 2019-10-18 | 2020-03-31 | 北京达佳互联信息技术有限公司 | 图像校验、视频校验方法、装置、设备及存储介质 |
KR102395166B1 (ko) * | 2021-10-29 | 2022-05-09 | 주식회사 딥노이드 | 특징 좌표 기반의 유사도 산출 장치 및 방법 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6711293B1 (en) | 1999-03-08 | 2004-03-23 | The University Of British Columbia | Method and apparatus for identifying scale invariant features in an image and use of same for locating an object in an image |
JP2004341940A (ja) | 2003-05-16 | 2004-12-02 | Fujitsu Ltd | 類似画像検索装置、類似画像検索方法、および類似画像検索プログラム |
JP2009163682A (ja) * | 2008-01-10 | 2009-07-23 | Toyota Central R&D Labs Inc | 画像識別装置及びプログラム |
JP2011113197A (ja) * | 2009-11-25 | 2011-06-09 | Kddi Corp | 画像検索方法およびシステム |
JP2012230501A (ja) * | 2011-04-25 | 2012-11-22 | Canon Inc | 画像処理装置、画像処理方法 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4889351B2 (ja) | 2006-04-06 | 2012-03-07 | 株式会社トプコン | 画像処理装置及びその処理方法 |
JP5290867B2 (ja) | 2009-05-25 | 2013-09-18 | キヤノン株式会社 | 画像検索装置およびその方法 |
JP2011221688A (ja) * | 2010-04-07 | 2011-11-04 | Sony Corp | 認識装置、認識方法、およびプログラム |
GB2487377B (en) | 2011-01-18 | 2018-02-14 | Aptina Imaging Corp | Matching interest points |
CN103065150A (zh) * | 2011-10-24 | 2013-04-24 | 康佳集团股份有限公司 | 基于智能移动终端的场景识别方法 |
EP2782067B1 (en) * | 2011-11-18 | 2019-09-18 | NEC Corporation | Local feature amount extraction device, local feature amount extraction method, and program |
CN102521838B (zh) * | 2011-12-19 | 2013-11-27 | 国家计算机网络与信息安全管理中心 | 一种图像匹配方法及系统 |
JP6280382B2 (ja) * | 2013-03-08 | 2018-02-14 | キヤノン株式会社 | 画像処理装置および画像処理方法 |
CN103456022B (zh) * | 2013-09-24 | 2016-04-06 | 中国科学院自动化研究所 | 一种高分辨率遥感图像特征匹配方法 |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6711293B1 (en) | 1999-03-08 | 2004-03-23 | The University Of British Columbia | Method and apparatus for identifying scale invariant features in an image and use of same for locating an object in an image |
JP2004341940A (ja) | 2003-05-16 | 2004-12-02 | Fujitsu Ltd | 類似画像検索装置、類似画像検索方法、および類似画像検索プログラム |
JP2009163682A (ja) * | 2008-01-10 | 2009-07-23 | Toyota Central R&D Labs Inc | 画像識別装置及びプログラム |
JP2011113197A (ja) * | 2009-11-25 | 2011-06-09 | Kddi Corp | 画像検索方法およびシステム |
JP2012230501A (ja) * | 2011-04-25 | 2012-11-22 | Canon Inc | 画像処理装置、画像処理方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3089108A4 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2017006852A1 (ja) * | 2015-07-06 | 2017-12-28 | 日本電信電話株式会社 | 画像照合装置、画像照合方法、及びプログラム |
US10572766B2 (en) | 2015-07-06 | 2020-02-25 | Nippon Telegraph And Telephone Corporation | Image collation device, image collation method, and program |
WO2017179728A1 (ja) * | 2016-04-14 | 2017-10-19 | シャープ株式会社 | 画像認識装置、画像認識方法および画像認識プログラム |
Also Published As
Publication number | Publication date |
---|---|
HK1224069A1 (zh) | 2017-08-11 |
US20160300122A1 (en) | 2016-10-13 |
KR20160103053A (ko) | 2016-08-31 |
US9798955B2 (en) | 2017-10-24 |
EP3089108A4 (en) | 2017-08-23 |
EP3089108B1 (en) | 2022-02-02 |
EP3089108A1 (en) | 2016-11-02 |
KR101822317B1 (ko) | 2018-01-25 |
JP6459981B2 (ja) | 2019-01-30 |
JPWO2015099016A1 (ja) | 2017-03-23 |
CN105849776A (zh) | 2016-08-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6459981B2 (ja) | 画像処理装置、被写体識別方法及びプログラム | |
WO2023138300A1 (zh) | 目标检测方法及应用其的移动目标跟踪方法 | |
US10878295B2 (en) | Method and apparatus for recognizing image | |
EP3186780B1 (en) | System and method for image scanning | |
US20070058856A1 (en) | Character recoginition in video data | |
NL2016542B1 (en) | Spatial data analysis. | |
CN111797709B (zh) | 一种基于回归检测的实时动态手势轨迹识别方法 | |
CN109146816A (zh) | 一种图像滤波方法、装置、电子设备和存储介质 | |
WO2014138622A1 (en) | Performance prediction for generation of point clouds from passive imagery | |
CN111476813A (zh) | 图像变化检测方法、装置、电子设备及存储介质 | |
CN110879982A (zh) | 一种人群计数系统及方法 | |
WO2022166363A1 (zh) | 一种基于近邻子空间划分高光谱影像波段选择方法及系统 | |
CN111862222A (zh) | 一种目标检测方法及电子设备 | |
CN117291936A (zh) | 点云分割方法、装置、设备和介质 | |
KR102366364B1 (ko) | 기하학적 패턴 매칭 방법 및 이러한 방법을 수행하는 장치 | |
CN117495891B (zh) | 点云边缘检测方法、装置和电子设备 | |
CN113918744A (zh) | 相似图像检索方法、装置、存储介质及计算机程序产品 | |
US20210082141A1 (en) | Image processing apparatus, image processing method, and computer-readable medium | |
CN111027609A (zh) | 一种图像数据加权分类方法和系统 | |
Steckenrider et al. | A probabilistic superpixel-based method for road crack network detection | |
CN112712123A (zh) | 匹配筛选方法、装置、电子设备和计算机可读存储介质 | |
CN112632601A (zh) | 面向地铁车厢场景的人群计数方法 | |
WO2015178001A1 (ja) | 画像照合システム、画像照合方法、およびプログラムを記憶する記録媒体 | |
Jumanov et al. | Optimization of recognition of microorganisms based on histological information structures of images | |
CN117953252B (zh) | 高速公路资产数据自动化采集方法及系统 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14874117 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15100694 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2015554986 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REEP | Request for entry into the european phase |
Ref document number: 2014874117 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014874117 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 20167020030 Country of ref document: KR Kind code of ref document: A |