WO2011089872A1 - Image management device, image management method, program, recording medium, and integrated circuit - Google Patents
- Publication number: WO2011089872A1 (PCT/JP2011/000150)
- Authority: WIPO (PCT)
- Prior art keywords
- cluster
- image
- importance
- occurrence
- feature amount
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/30—Scenes; Scene-specific elements in albums, collections or shared content, e.g. social network photos or video
Definitions
- the present invention relates to an image management technique, and more particularly to an image search technique for efficiently searching for a desired image from an enormous number of images.
- Conventionally, techniques for ranking images have been proposed, such as a method of ranking images by evaluating the facial expression of a person included in each input photographed image (for example, Patent Document 1).
- Another known technique ranks images by evaluating the shooting state of each image based on predetermined setting conditions, such as the orientation of a human face or the degree to which the eyes are open (for example, Patent Document 2).
- An object of the present invention is to provide an image management apparatus and an image management method.
- To achieve this object, an image management apparatus according to the present invention includes: an image acquisition unit that acquires images; an object detection unit that detects objects by extracting, from each acquired image, an object feature amount, which is a feature amount according to a predetermined criterion concerning the distribution of pixel values of a plurality of pixels corresponding to an object (for example, a human face) included in the image; object classification means for classifying each object detected in each image acquired by the image acquisition unit into one of a plurality of clusters according to its object feature amount; object importance evaluation means for evaluating, for each object, an object importance, which is the importance of that object, based on the number of objects belonging to the same cluster as that object; and image importance evaluation means for evaluating the importance of one image based on the object importance of the objects it contains.
- a cluster is a unit of classification when objects having similar object feature quantities are grouped together, and each cluster corresponds to a different feature quantity range.
- With the above configuration, if the predetermined criterion defines the feature amount of a human face, the image management apparatus evaluates each image based on the object importance corresponding to the importance of the person's face included in that image.
- It is an example of image importance according to Embodiment 1. It is an example of a ranking result.
- The operation of Embodiment 1 is shown.
- the co-occurrence information generation process is shown.
- the accuracy calculation process is shown. It is an example of certainty. It is an example of a support degree.
- It is a functional configuration diagram of a modified image management apparatus. It is a functional block diagram of an object part. It is an example of detecting a co-occurring object in an image. It is an example of object appearance information. It is an example of object cluster classification information. It is an example of co-occurrence information with respect to an object cluster. It is a flowchart illustrating the operation of the second embodiment. The co-occurrence information generation process of the second embodiment is shown. The accuracy calculation process of Embodiment 2 is shown.
- FIG. 10 is a functional configuration diagram of a modified image management apparatus 3500 according to a third embodiment. It is an example of reliability information. It is a flowchart showing the operation of the third embodiment. The reliability calculation process is shown. The accuracy calculation process according to the third embodiment is shown.
- FIG. 1 is a system configuration diagram illustrating an example of an image management system 10 including an image management apparatus 100 according to an embodiment of the present invention and an apparatus related thereto.
- the image management device 100 is connected to the photographing device 110 and the display device 120. Further, the image management apparatus 100 can also receive a user operation from the controller 130.
- the photographing device 110 is a device that can photograph images and store the photographed images, for example, a digital camera.
- the stored image group is input to the image management apparatus 100 via a cable such as a Universal Serial Bus (USB) cable.
- An image here is a collection of pixel value data.
- the image may be, for example, a still image such as a photograph or a moving image.
- In the present embodiment, it is assumed that the images are photographs, that is, still images.
- the display device 120 is a device, for example a digital television, that displays video output from the image management device 100 and is connected to it via a cable such as a High-Definition Multimedia Interface (HDMI) cable.
- the image management apparatus 100 receives an image group from the image capturing apparatus 110, ranks the input image group based on the image importance that is the importance of the image, and outputs the ranking result to the display apparatus 120.
- the image management apparatus 100 detects an object having a specific pattern from the image, and evaluates an image containing a large number of objects that are evaluated as having high importance for the user as an image having high image importance. For this reason, the user can easily select an image important for himself / herself by searching for an image in order from an image having a high image importance level to an image having a low image importance level.
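The ranking step described above reduces to sorting images by their image importance. A minimal sketch, with a hypothetical function name and illustrative scores:

```python
def rank_images(image_importance):
    """Return image IDs ordered from highest to lowest image importance,
    so a user can browse the most important images first."""
    return sorted(image_importance, key=image_importance.get, reverse=True)

# Hypothetical importance scores keyed by image ID.
scores = {"I001": 27.3, "I002": 41.0, "I003": 8.9}
print(rank_images(scores))  # ['I002', 'I001', 'I003']
```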
- an object is an element detected in an image by a template possessed by the image management apparatus 100.
- the template has information for specifying an object.
- This template is data indicating a feature amount pattern related to a human face, for example. By using this template, a human face can be detected as an object.
- the image management apparatus 100 evaluates, for each object, an object importance level indicating how important the object is for the user.
- the object importance level is evaluated based on the number of objects that belong to the same cluster as the object to be evaluated based on the result of the image management apparatus 100 classifying the objects into clusters.
- a cluster is a unit of classification in which objects having similar object feature amounts are grouped together. For example, suppose the objects are people and there are multiple images of the same person; in many cases, the object feature amounts of that person extracted from each image will not be exactly the same, due to changes in the shooting conditions and so on. However, since feature amounts of the same person are considered to be similar to each other even if they are not identical, they are highly likely to belong to the same cluster. Therefore, if objects belonging to the same cluster are regarded as the same person, a person who has been photographed many times has many objects belonging to the same cluster, and the object importance is evaluated as high.
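As a sketch of this idea, assuming a hypothetical mapping from object IDs to cluster IDs, an object's importance can be read off as the size of the cluster it belongs to:

```python
from collections import Counter

def object_importance(cluster_assignments):
    """Score each object by the size of its cluster: objects of a
    frequently photographed person share a cluster, so all of them
    receive a high score."""
    cluster_sizes = Counter(cluster_assignments.values())
    return {obj: cluster_sizes[cl] for obj, cl in cluster_assignments.items()}

# Hypothetical assignment of six objects to three clusters.
assignments = {"O001": "C001", "O002": "C002", "O003": "C001",
               "O004": "C003", "O005": "C002", "O006": "C001"}
print(object_importance(assignments)["O001"])  # 3
```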
- <Embodiment 1> In Embodiment 1, the image management apparatus 100 detects a human face as an object, evaluates the importance of the person as the object importance, and evaluates and ranks the image importance of each image based on the object importance.
- As hardware, the image management apparatus 100 according to the first embodiment of the present invention includes a USB input terminal for inputting images, an HDMI output terminal for outputting video, a memory for storing data and programs, and a processor for executing the programs.
- FIG. 2 is a block diagram showing the functional configuration of the image management apparatus 100 according to the first embodiment of the present invention, including related apparatuses.
- the image management apparatus 100 includes an image acquisition unit 201, an image storage unit 202, an object detection unit 203, a template storage unit 204, an object appearance information storage unit 205, an object feature amount storage unit 206, an object classification unit 207, a cluster feature amount storage unit 208, a cluster classification information storage unit 209, a co-occurrence information generation unit 210, a similarity calculation unit 211, an accuracy calculation unit 212, an evaluation value calculation unit 213, an object importance evaluation unit 214, an image importance evaluation unit 215, an image ranking unit 216, an image output unit 217, and an operation input unit 218.
- the image acquisition unit 201 has a function of acquiring a group of images stored in the photographing apparatus 110 through an input interface such as a USB input terminal.
- the image acquisition unit 201 assigns an image ID (IDentifier) 301 to each image in the acquired image group, and stores the image ID 301 and the image data 302 in the image storage unit 202 in association with each other as the image group 300.
- the image storage unit 202 has a function of storing the image ID 301 and the image data 302 of all the images included in the image group 300 acquired by the image acquisition unit 201.
- the image storage unit 202 is realized by a memory, for example.
- FIG. 3 shows an example of the image group 300 stored in the image storage unit 202. FIG. 3 will be described in detail later.
- the object detection unit 203 extracts feature amounts from each image of the image group 300 stored in the image storage unit 202, detects objects using the templates stored in the template storage unit 204, and has a function of assigning to each detected object an object ID 402 for identifying it.
- the function of the object detection unit 203 is realized, for example, when a processor executes a program stored in a memory.
- the feature amount will be described in detail later.
- FIG. 4 shows an example of detecting an object from an image. FIG. 4 will be described in detail later.
- for each image, the object detection unit 203 further associates the image ID 301 with the object IDs 402 of the objects detected in that image and stores them in the object appearance information storage unit 205 as object appearance information 500; for each detected object, it associates the object ID 402 with the object feature amount 601, which is the feature amount of that object, and stores them in the object feature amount storage unit 206.
- the template storage unit 204 stores a template having information for detecting an object in the image by the object detection unit 203.
- the template storage unit 204 is realized by a memory, for example.
- the template is data indicating a feature amount pattern related to a human face, and the template storage unit 204 stores a template generated from learning data prepared in advance.
- the object appearance information storage unit 205 has a function of storing the object appearance information 500 for each image.
- the object appearance information storage unit 205 is realized by a memory, for example.
- FIG. 5 shows an example of the object appearance information 500 stored in the object appearance information storage unit 205. FIG. 5 will be described in detail later.
- the object feature amount storage unit 206 has a function of storing the object feature amount 601 of the object detected by the object detection unit 203 together with the object ID 402.
- the object feature amount storage unit 206 is realized by a memory, for example.
- FIG. 6 shows an example of the object feature amount of the object stored in the object feature amount storage unit 206. FIG. 6 will be described in detail later.
- the object classification unit 207 has a function of classifying objects into clusters based on the object feature amounts stored in the object feature amount storage unit 206 and the cluster feature amounts 702 stored in the cluster feature amount storage unit 208. It also has a function of calculating the cluster feature amount 702 of a cluster from the object feature amounts of the objects classified into that cluster. It assigns to each cluster a cluster ID 703 for identifying it, stores the calculated cluster feature amount 702 in the cluster feature amount storage unit 208 in association with the cluster ID 703, and stores, as cluster classification information 900 in the cluster classification information storage unit 209, each cluster ID 703 in association with the object IDs 402 of the objects classified into that cluster and the number of objects so classified.
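The patent does not fix a particular classification algorithm; a minimal nearest-centroid sketch, assuming Euclidean distance in the feature amount space 700 and hypothetical cluster feature values, might look like:

```python
import math

def classify(obj_feat, cluster_feats):
    """Nearest-centroid assignment: pick the cluster whose cluster feature
    amount 702 is closest (smallest Euclidean distance) to the object's
    feature amount. One plausible reading of the object classification
    unit 207; the exact algorithm is an assumption."""
    return min(cluster_feats,
               key=lambda cid: math.dist(obj_feat, cluster_feats[cid]))

# Hypothetical cluster feature amounts keyed by cluster ID.
clusters = {"C001": [94.4, 90.2, 79.8], "C002": [60.0, 50.0, 40.0]}
print(classify([90.3, 98.4, 71.4], clusters))  # C001
```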
- the function of the object classification unit 207 is realized, for example, when a processor executes a program stored in a memory.
- the cluster feature amount storage unit 208 stores a cluster feature amount 702 possessed by the cluster in association with the cluster ID 703.
- the cluster feature amount storage unit 208 is realized by a memory, for example.
- the cluster feature quantity 702 stored in the cluster feature quantity storage unit 208 is updated by the object classification unit 207 as necessary.
- the cluster classification information storage unit 209 has a function of storing the cluster classification information 900 for each cluster.
- the cluster classification information storage unit 209 is realized by a memory, for example.
- FIG. 9 shows an example of cluster classification information 900 stored in the cluster classification information storage unit 209. FIG. 9 will be described in detail later.
- the co-occurrence information generation unit 210 uses the object appearance information 500 stored in the object appearance information storage unit 205 and the cluster classification information 900 stored in the cluster classification information storage unit 209 for each image in the image group 300. It has a function of generating co-occurrence information 1100 by detecting a co-occurrence relationship and a non-co-occurrence state.
- the function of the co-occurrence information generation unit 210 is realized, for example, when a processor executes a program stored in a memory.
- the co-occurrence information 1100 will be described in detail later.
- the similarity calculation unit 211 has a function of calculating a similarity 1201 indicating how similar the object feature amount of an object stored in the object feature amount storage unit 206 is to the cluster feature amount 702 of a cluster stored in the cluster feature amount storage unit 208.
- the function of the similarity calculation unit 211 is realized, for example, when a processor executes a program stored in a memory.
- the accuracy calculation unit 212 has a function of calculating the accuracy 1301, which is used by the evaluation value calculation unit 213 in calculating the evaluation value 1401, based on the similarity 1201 calculated by the similarity calculation unit 211 and the co-occurrence information 1100 generated by the co-occurrence information generation unit 210. The function of the accuracy calculation unit 212 is realized, for example, by a processor executing a program stored in a memory.
- the evaluation value calculation unit 213 has a function of calculating an evaluation value 1401 for a combination of an object and a cluster from the accuracy 1301 calculated by the accuracy calculation unit 212 and the number of objects classified into the cluster, stored in the cluster classification information storage unit 209.
- the function of the evaluation value calculation unit 213 is realized, for example, when a processor executes a program stored in a memory.
- the object importance level evaluation unit 214 has a function of evaluating the object importance level 1501 of an object based on the evaluation value 1401 calculated by the evaluation value calculation unit 213.
- the function of the object importance level evaluation unit 214 is realized, for example, when a processor executes a program stored in a memory.
- the image importance evaluation unit 215 has a function of evaluating the image importance 1601 of an image based on the object appearance information 500 stored in the object appearance information storage unit 205 and the object importance 1501 evaluated by the object importance evaluation unit 214.
- the function of the image importance degree evaluation unit 215 is realized by, for example, a processor executing a program stored in a memory.
- the image ranking unit 216 has a function of ordering the image groups 300 based on the image importance 1601 evaluated by the image importance evaluation unit 215.
- the function of the image ranking unit 216 is realized, for example, when a processor executes a program stored in a memory.
- the image output unit 217 has a function of displaying the image group 300 stored in the image storage unit 202 on the display device 120 through an output interface such as an HDMI output terminal, in the order determined by the image ranking unit 216. The display mode of the output images can also be changed by a control signal received from the operation input unit 218. For example, when there are too many images to fit on the screen at once, the screen can be scrolled to show the images that are not displayed.
- the function of the image output unit 217 is realized, for example, when a processor executes a program stored in a memory.
- the operation input unit 218 has a function of receiving a user operation issued from the controller 130 by an infrared receiver or the like and transmitting a control signal corresponding to the operation to the image output unit 217.
- the function of the operation input unit 218 is realized, for example, when a processor executes a program stored in a memory.
- FIG. 3 is a diagram showing a data structure and example contents of the image group 300.
- the image group 300 includes an image ID 301 for identifying each image and image data 302.
- the image ID 301 is an identifier for uniquely identifying each image in the image management apparatus 100, and is assigned by the image acquisition unit 201 so as to correspond to the image data 302 on a one-to-one basis.
- the image ID 301 is generated by the image acquisition unit 201. For example, it is assumed that the image ID 301 is numbered from 1 in the order in which the image acquisition unit 201 has acquired images from the photographing apparatus 110, and an alphabet “I” is added to the head of the number.
- an image ID 301 of I001 is assigned to the image data 302a
- I002 is assigned to the image data 302b
- I003 is assigned to the image data 302c
- I004 is assigned to the image data 302d.
- the object appearance information 500 is information indicating which object is detected in which image.
- the object appearance information 500 is generated by the object detection unit 203, stored in the object appearance information storage unit 205, and used by the co-occurrence information generation unit 210, the accuracy calculation unit 212, and the image importance degree evaluation unit 215.
- FIG. 4 shows an example of an area 401 where the object detection unit 203 detects an object and an object ID 402 of an object detected in the area 401.
- FIG. 5 shows the data structure of the object appearance information 500 and example contents corresponding to FIG. 4.
- the object appearance information 500 is represented for each image as a set of an image ID 301 and an object ID 402 for identifying each object of the object detected in the image. There may be one object included in the image, or there may be a plurality of objects or none.
- the object ID 402 is an identifier for uniquely identifying each object in the image management apparatus 100, and is given by the object detection unit 203 so as to correspond to the object on a one-to-one basis.
- the object ID 402 is generated by the object detection unit 203. For example, it is assumed that the object ID 402 is numbered from 1 in the order in which the object detection unit 203 detected the object, and an alphabet “O” is added to the head of the number.
- the object detected in the area 401a is O001
- the object detected in the area 401b is O002
- the object detected in the area 401c is O003
- the object detected in the area 401d is O004
- an object ID 402 of O005 is assigned to the object detected in the area 401e
- an object ID 402 of O006 is assigned to the object detected in the area 401f.
- an object identified by an object ID 402 of O001 is called an object O001.
- the object ID 402 of the object included in the specific image can be acquired, and conversely, the image ID 301 of the image including the specific object can also be acquired.
- the feature amount indicates a feature related to a distribution of pixel values related to a plurality of pixels in the image.
- the feature amount is a vector having a plurality of numerical values indicating image features as components.
- Image features include the periodicity and directionality of the pixel value distribution of the image data, obtained using a Gabor filter. In the case of a human face, for example, the distance between two points recognized as eyes, or the distance between a point recognized as a nose and a point recognized as a mouth, can be used as components.
- the object feature quantity 601 is a feature quantity detected as an object among the feature quantities extracted by the object detection unit 203, is generated by the object detection unit 203, and is stored in the object feature quantity storage unit 206 together with the object ID 402. Then, it is used in the object classification unit 207 and the similarity calculation unit 211.
- FIG. 6 shows a data configuration and example contents of the object feature quantity 601 stored in the object feature quantity storage unit 206.
- the object feature quantity 601 is composed of a plurality of feature quantity components including a feature quantity component 1, a feature quantity component 2, and a feature quantity component 3.
- the object feature quantity 601 of the object O001 stores the feature quantity component 1 as 90.3, the feature quantity component 2 as 98.4, and the feature quantity component 3 as 71.4.
- <Cluster> The cluster ID 703, the cluster feature amount 702, and the cluster classification information 900 will be described as data concerning clusters.
- FIG. 7 is a conceptual diagram of how the object classification unit 207 classifies objects into clusters.
- 601a, 601b, 601c, 601d, 601e, and 601f indicate the object feature quantity 601 of the object O001, object O002, object O003, object O004, object O005, and object O006, respectively.
- symbols and objects correspond to each other.
- the feature amount space 700 includes three clusters, a cluster 701a, a cluster 701b, and a cluster 701c, which are separated by a cluster boundary 704.
- the cluster ID 703 is an identifier for uniquely identifying each cluster in the image management apparatus 100, and is assigned by the object classification unit 207 so as to correspond to the cluster on a one-to-one basis.
- the cluster ID 703 is generated by the object classification unit 207.
- the cluster ID 703 is numbered from 1 in the order in which the object classification unit 207 generates clusters, and an alphabet “C” is added to the head of the number.
- a cluster ID 703 of C001 is assigned to the cluster 701a
- C002 is assigned to the cluster 701b
- C003 is assigned to the cluster 701c.
- cluster C001 the cluster identified by the cluster ID 703 of C001.
- the cluster feature amount 702 is a feature amount that the cluster has, and is a value that represents the object feature amount 601 of all objects included in the cluster.
- the cluster feature quantity 702 is stored in the cluster feature quantity storage unit 208, and is generated, discarded, and updated by the object classification unit 207 as necessary.
- FIG. 8 shows an example of the data structure of the cluster feature quantity 702 and data contents corresponding to the cluster of FIG.
- the data structure of the cluster feature quantity 702 is the same as that of the object feature quantity 601.
- the cluster feature amount 702 is calculated as an arithmetic average of the object feature amounts 601 of the objects included in the cluster, for example.
- the cluster feature quantity 702 of the cluster C001 stores feature quantity component 1 as 94.4, feature quantity component 2 as 90.2, and feature quantity component 3 as 79.8.
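Under the arithmetic-average reading above, a cluster feature amount can be computed component by component. Here O001's feature vector is taken from the FIG. 6 example, and the second member vector is hypothetical, chosen so that the mean matches the C001 values quoted above:

```python
def cluster_feature(object_features):
    """Arithmetic mean, component by component, of the member objects'
    feature vectors -- one way the cluster feature amount 702 can be
    derived from the object feature amounts 601."""
    n = len(object_features)
    return [sum(v[i] for v in object_features) / n
            for i in range(len(object_features[0]))]

# O001's vector (from FIG. 6) plus one hypothetical member vector.
members = [[90.3, 98.4, 71.4], [98.5, 82.0, 88.2]]
print([round(c, 1) for c in cluster_feature(members)])  # [94.4, 90.2, 79.8]
```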
- the cluster classification information 900 is information indicating which object the object classification unit 207 has classified into which cluster.
- the cluster classification information 900 is generated by the object classification unit 207, stored in the cluster classification information storage unit 209, and used by the co-occurrence information generation unit 210 and the evaluation value calculation unit 213.
- FIG. 9 shows an example of the data structure of the cluster classification information 900 and data contents corresponding to the cluster of FIG.
- the cluster classification information 900 includes a cluster ID 703, an object ID 402 of each object belonging to the cluster, and a number 901 of objects belonging to the cluster for each cluster.
- In FIG. 7, the objects O001, O003, and O006, whose object feature amounts 601 are indicated by 601a, 601c, and 601f, belong to the cluster C001 indicated by reference numeral 701a. Accordingly, in FIG. 9, which corresponds to FIG. 7, the cluster classification information 900 of the cluster C001 stores the information that the objects O001, O003, and O006 belong to it, and the total number 901 of objects belonging to the cluster C001, which is 30.
- <2-2-5. Co-occurrence information> Here, co-occurrence and non-co-occurrence will be described first, followed by the co-occurrence information 1100.
- co-occurrence means that two events occur together.
- When cluster A and cluster B co-occur, there is said to be a co-occurrence relationship between cluster A and cluster B. In particular, when the event "an object belonging to cluster B is included" occurs together with the event "an object belonging to cluster A is included", there is said to be a co-occurrence relationship from cluster A to cluster B.
- Fig. 10 is a conceptual diagram of the co-occurrence relationship.
- a broken arrow 1001 indicates a co-occurrence relationship and connects objects included together in the same image.
- When an object a and an object b are included together in the same image, an arrow 1001 is drawn from the object a to the object b, and an arrow 1001 is also drawn from the object b to the object a.
- When an arrow 1001 is drawn from an object a belonging to cluster A to an object b belonging to cluster B, there is a co-occurrence relationship from cluster A to cluster B.
- In FIG. 7, the image including the object O001 corresponding to 601a also includes the object O002 corresponding to 601b, so it can be said that there is a co-occurrence relationship from the cluster C001 to the cluster C002. At the same time, since the object O001 is included in the image including the object O002, it can also be said that there is a co-occurrence relationship from the cluster C002 to the cluster C001.
- non-co-occurrence means not co-occurring, and here, in particular, indicates that a certain cluster does not co-occur with any cluster in one image. That is, here, non-co-occurrence refers to an event in which only one object belonging to a certain cluster is included in one image.
- In FIG. 7, the object O006 corresponding to 601f, which is not connected by any arrow, is included alone in an image, and no other object is included in that image. Therefore, in the image including the object O006, the cluster C001 (701a) to which it belongs can be said to be in a non-co-occurrence state.
- the co-occurrence information 1100 is information regarding co-occurrence between clusters, is generated by the co-occurrence information generation unit 210, and is used by the accuracy calculation unit 212.
- FIG. 11 shows a data structure and content example of the co-occurrence information 1100.
- the co-occurrence information 1100 includes a co-occurrence degree 1101 indicating to what degree a co-occurrence relationship exists from which cluster to which cluster across all the images of the image group 300, and a non-co-occurrence degree 1102 indicating to what degree each cluster occurs in a non-co-occurrence state across all the images of the image group 300.
- the co-occurrence degree 1101 is the number of times the co-occurrence relationship is detected in the image group 300.
- the co-occurrence degree 1101 of cluster A with respect to cluster B is the number of times that the co-occurrence relationship between cluster A and cluster B is detected in the image group 300.
- the non-co-occurrence degree 1102 is the number of times a non-co-occurrence state is detected in the image group 300.
- the non-co-occurrence degree 1102 of the cluster A is the number of times that the non-co-occurrence state of the cluster A is detected in the image group 300 and matches the number of images in which an object belonging to the cluster A is included alone.
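The counting of co-occurrence degrees and non-co-occurrence degrees defined above can be sketched as follows; the input mappings are hypothetical stand-ins for the object appearance information 500 and the cluster classification information 900:

```python
from collections import defaultdict

def co_occurrence(images, cluster_of):
    """Count the directed co-occurrence degree 1101 between clusters and
    the non-co-occurrence degree 1102 over a group of images.
    `images` maps image_id -> list of object_ids detected in that image;
    `cluster_of` maps object_id -> cluster_id."""
    co = defaultdict(int)      # (cluster_a, cluster_b) -> co-occurrence degree
    non_co = defaultdict(int)  # cluster -> non-co-occurrence degree
    for objects in images.values():
        if len(objects) == 1:
            # An object alone in an image: its cluster is non-co-occurring.
            non_co[cluster_of[objects[0]]] += 1
            continue
        for a in objects:
            for b in objects:
                if a != b:
                    co[(cluster_of[a], cluster_of[b])] += 1
    return co, non_co

images = {"I001": ["O001", "O002"], "I002": ["O006"]}
cluster_of = {"O001": "C001", "O002": "C002", "O006": "C001"}
co, non_co = co_occurrence(images, cluster_of)
print(co[("C001", "C002")], non_co["C001"])  # 1 1
```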
- In the example of FIG. 11, for the cluster C001, the co-occurrence degree 1101 with respect to the cluster C001 is 0, the co-occurrence degree 1101 with respect to the cluster C002 is 8, the co-occurrence degree 1101 with respect to the cluster C003 is 2, and the non-co-occurrence degree 1102 is 5.
- The fact that the co-occurrence degree 1101 of the cluster C001 with respect to the cluster C001 is 0 means that no image including an object belonging to the cluster C001 includes another object belonging to the cluster C001 in addition to that object.
- the co-occurrence degree 1101 of the cluster C001 with respect to the cluster C002 means that the number of times that the object belonging to the cluster C002 is included in the image including the object belonging to the cluster C001 is eight.
- The non-co-occurrence degree 1102 of the cluster C001 being 5 means that the number of times no other object was included in an image containing an object belonging to the cluster C001 is five; that is, the number of images that include an object belonging to the cluster C001 alone is five. <2-2-6. Similarity>
- the similarity 1201 is a value indicating how close the object feature quantity 601 of an object and the cluster feature quantity 702 of a cluster are.
- the similarity 1201 is generated by the similarity calculator 211 and used by the accuracy calculator 212.
- FIG. 12 shows the data structure and content example of the similarity 1201.
- the similarity 1201 is a numerical value for a combination of the object ID 402 of the object and the cluster ID 703 of the cluster.
- As the similarity 1201, for example, a numerical value represented by the inner product of the vector indicating the object feature amount 601 of the object and the vector indicating the cluster feature amount 702 of the cluster in the feature amount space 700, or a numerical value calculated from the difference between the object feature amount 601 of the object and the cluster feature amount 702 of the cluster, can be used.
- the similarity 1201 is calculated from the difference between the object feature quantity 601 of the object and the cluster feature quantity 702 of the cluster.
- the similarity 1201 of the object O003 with respect to the cluster C001, the cluster C002, and the cluster C003 is 0.50, 0.46, and 0.42, respectively.
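The patent leaves the exact difference-based formula open; one plausible sketch (an assumption, not the patent's formula) maps the Euclidean distance between the two vectors into (0, 1]:

```python
import math

def similarity(obj_feat, cluster_feat):
    """A difference-based similarity: identical vectors score 1.0 and the
    score shrinks as the Euclidean distance between the object feature
    amount 601 and the cluster feature amount 702 grows."""
    return 1.0 / (1.0 + math.dist(obj_feat, cluster_feat))

print(similarity([90.3, 98.4, 71.4], [90.3, 98.4, 71.4]))  # 1.0
```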
- the accuracy 1301 is a value indicating how strongly an object is associated with a cluster, determined using not only the similarity 1201 but also the co-occurrence information 1100.
- the accuracy 1301 is generated by the accuracy calculation unit 212 and used by the evaluation value calculation unit 213. By using the accuracy 1301, it is possible to evaluate the related height with higher accuracy than judging the related height only from the similarity 1201.
- the accuracy 1301 of the object A with respect to the cluster B is calculated based on the similarity 1201 of the object A with respect to the cluster B and the co-occurrence information 1100 of the cluster B.
- the calculation method will be described in detail later.
- FIG. 13 shows an example of the data structure and contents of accuracy 1301.
- the accuracy 1301 is a numerical value for a combination of the object ID 402 of the object and the cluster ID 703 of the cluster.
- the accuracy 1301 of the object O003 with respect to the cluster C001, cluster C002, and cluster C003 is 0.46, 0.53, and 0.39, respectively.
- the evaluation value 1401 is an importance calculated for a combination of an object and a cluster, and an object importance 1501 described later is evaluated based on the evaluation value 1401.
- the evaluation value 1401 is generated by the evaluation value calculation unit 213 and used by the object importance level evaluation unit 214.
- the evaluation value 1401 of the object A with respect to the cluster B is calculated as the product of the accuracy 1301 of the object A with respect to the cluster B and the number 901 of objects belonging to the cluster B.
- FIG. 14 shows a data configuration and example contents of the evaluation value 1401.
- the evaluation value 1401 is constituted by a numerical value for a combination of the object ID 402 of the object and the cluster ID 703 of the cluster.
- the evaluation values 1401 for the cluster C001, cluster C002, and cluster C003 of the object O003 are 13.6, 14.2, and 7.77, respectively.
- the evaluation value 1401 of the object O003 with respect to the cluster C001 is 13.6, the product of 0.46, which is the accuracy 1301 of the object O003 with respect to the cluster C001, and 30, which is the number 901 of objects belonging to the cluster C001. <2-2-8. Object importance>
- the object importance 1501 is an importance evaluated for each object, and an image importance 1601 described later is evaluated based on the object importance 1501 of the object.
- the object importance level 1501 is generated by the object importance level evaluation unit 214 and used by the image importance level evaluation unit 215.
- the object importance 1501 of an object is evaluated as the sum of evaluation values 1401 for each cluster of the object.
- FIG. 15 shows an example of the data structure and contents of the object importance 1501.
- the object importance 1501 is a numerical value for the object ID 402 of the object.
- the object importance levels 1501 of the objects O001, O002, and O003 are 40.3, 25.6, and 38.1, respectively.
- for example, the object O001 has an evaluation value 1401 of 13.6 for the cluster C001, 14.2 for the cluster C002, and 7.77 for the cluster C003, and the sum of its evaluation values 1401 over all clusters is 40.3.
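The evaluation-value product and its summation into the object importance can be sketched as follows; the accuracy values and cluster sizes below are illustrative stand-ins, not the document's exact (unrounded) figures:

```python
# Accuracy 1301 per (object, cluster) and the number 901 of objects per
# cluster; the concrete values are illustrative assumptions.
accuracy = {("O003", "C001"): 0.46, ("O003", "C002"): 0.53, ("O003", "C003"): 0.39}
cluster_size = {"C001": 30, "C002": 27, "C003": 20}

def evaluation_value(obj_id, cluster_id):
    # Evaluation value 1401 = accuracy x number of objects in the cluster.
    return accuracy[(obj_id, cluster_id)] * cluster_size[cluster_id]

def object_importance(obj_id):
    # Object importance 1501 = sum of evaluation values over all clusters.
    return sum(evaluation_value(obj_id, c) for c in cluster_size)
```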
- the image importance 1601 is an importance evaluated for each image, and the image management apparatus 100 ranks the image group 300 based on the image importance 1601 of each image.
- the image importance 1601 is generated by the image importance evaluation unit 215 and used by the image ranking unit 216.
- the image importance 1601 of an image is evaluated as the sum of the object importance 1501 of all objects included in the image.
- FIG. 16 shows the data structure and content example of the image importance 1601.
- the image importance 1601 includes a numerical value for the image ID 301 of the image.
- the image importance 1601 of the image I001, the image I002, the image I003, and the image I004 are 65.9, 89.4, 28.8, and 0, respectively.
- the image importance 1601 of the image I001 is 65.9, obtained by adding 40.3, the object importance 1501 of the object O001, and 25.6, the object importance 1501 of the object O002.
- an example of the result of arranging the images of the image group 300 based on the image importance 1601 is shown in FIG. 17.
- the rank 1701 of the image I017, whose image importance 1601 is 128, is first, followed by the image I002, the image I001, and the image I072 from the second rank onward.
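The image-importance sum and the descending ranking above reduce to a few lines; the per-image object lists below are illustrative assumptions:

```python
# Object importance 1501 per object and object appearance per image;
# the numeric values mirror the document's example, the layout is assumed.
object_importance = {"O001": 40.3, "O002": 25.6, "O003": 38.1}
objects_in_image = {"I001": ["O001", "O002"], "I004": []}

def image_importance(image_id):
    # Image importance 1601: sum over the image's objects (0 if none).
    return sum(object_importance[o] for o in objects_in_image[image_id])

# Ranking: descending order of image importance 1601.
ranked = sorted(objects_in_image, key=image_importance, reverse=True)
```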
- the operation of the image management apparatus 100 according to the present invention will be described.
- the image acquisition unit 201 acquires an image group stored in the photographing apparatus 110. Then, each image data 302 of the acquired image group 300 is stored in the image storage unit 202 together with an image ID 301 for identifying each image (S1801).
- the object detection unit 203 extracts the object feature quantity 601 in each image stored in the image storage unit 202, and detects an object (S1802).
- the object detection unit 203 generates object appearance information 500 for each image and stores it in the object appearance information storage unit 205.
- the object feature quantity 601 of the detected object is stored in the object feature quantity storage unit 206 in association with the object ID 402. The object detection process will be described in detail later.
- the object classification unit 207 classifies all objects detected by the object detection unit 203 into clusters based on the object feature amount 601 of each object stored in the object feature amount storage unit 206 (S1803).
- the object classification unit 207 also calculates a cluster feature amount 702 representing the cluster.
- the cluster classification information 900, which is the result of the classification, is stored in the cluster classification information storage unit 209.
- the calculated cluster feature quantity 702 is stored in the cluster feature quantity storage unit 208. The object classification process will be described in detail later.
- the similarity calculation unit 211 calculates the similarity 1201 between each object and each cluster from the object feature amount 601 of each object stored in the object feature amount storage unit 206 and the cluster feature amount 702 stored in the cluster feature amount storage unit 208 (S1804 to S1806).
- the co-occurrence information generation unit 210 detects the co-occurrence relationships and non-co-occurrence states in the image group 300 using the object appearance information 500 stored in the object appearance information storage unit 205 and the cluster classification information 900 stored in the cluster classification information storage unit 209, and generates the co-occurrence information 1100 for all clusters (S1807).
- the generation process of the co-occurrence information 1100 will be described in detail later.
- the accuracy calculation unit 212 calculates the accuracy 1301 for each cluster of each object based on the similarity 1201 calculated by the similarity calculation unit 211 and the co-occurrence information 1100 generated by the co-occurrence information generation unit 210.
- the evaluation value calculation unit 213 calculates an evaluation value 1401 for each cluster of each object based on the accuracy 1301 calculated by the accuracy calculation unit 212 and the cluster classification information 900 stored in the cluster classification information storage unit 209. (S1808 to S1809).
- the calculation process of the accuracy 1301 and the calculation process of the evaluation value 1401 will be described in detail later.
- the object importance level evaluation unit 214 evaluates the object importance level 1501 of each object based on the evaluation value 1401 calculated by the evaluation value calculation unit 213 (S1810 to S1811).
- the object importance 1501 is evaluated as the sum of all evaluation values 1401 for each cluster of the object to be evaluated.
- the image importance evaluation unit 215 evaluates the image importance 1601 of each image based on the object importance 1501 evaluated by the object importance evaluation unit 214 and the object appearance information 500 stored in the object appearance information storage unit 205 (S1812 to S1813).
- the image importance 1601 of the image is the sum of the object importances 1501 of all the objects included in the image. If no object is included, the image importance 1601 of the image is set to 0.
- the image ranking unit 216 ranks the image group 300 based on the image importance 1601 evaluated by the image importance evaluation unit 215 (S1814).
- the images are arranged in descending order of the numerical value of the image importance 1601.
- the image output unit 217 outputs the result of ranking by the image ranking unit 216 (S1815). Based on the order rearranged by the image ranking unit 216 and the operation received by the operation input unit 218, the image group 300 stored in the image storage unit 202 is arranged and output to the display device 120. <2-3-2. Object detection process> Here, the object detection process (S1802) performed by the object detection unit 203 will be described.
- the object detection unit 203 first extracts a feature amount from the target image for detecting the object.
- as a method for extracting feature amounts from an image, there is, for example, a method of extracting feature amounts such as the periodicity and directionality of the pixel value distribution of the image data using a Gabor filter.
- the object detection unit 203 compares the extracted feature amount with the template stored in the template storage unit 204 to detect an object.
- an object is detected when the extracted feature amount matches the feature amount pattern of the template.
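A minimal sketch of this matching step, assuming feature vectors and a dot-product match score (both the score and the threshold are assumptions; the actual matching criterion is not detailed here):

```python
def matches_template(feature, template, threshold=0.9):
    # An object is detected when the extracted feature amount is close
    # enough to the template's feature-amount pattern; "close enough"
    # is modeled here as an assumed dot-product threshold.
    score = sum(a * b for a, b in zip(feature, template))
    return score >= threshold

def detect(region_features, template):
    # Return the indices of the regions in which an object is detected.
    return [i for i, f in enumerate(region_features) if matches_template(f, template)]
```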
- the objects detected by the object detection unit 203 are numbered from 1 in the order in which the objects are detected, and an object ID 402 with an alphabetic “O” added to the beginning of the number is assigned.
- the object detection unit 203 stores the set of the image ID 301 of the image as the object detection target and the object IDs 402 of all the objects detected in the image as the object appearance information 500 in the object appearance information storage unit 205. Further, the feature quantity extracted from the area 401 where the object is detected is stored in the object feature quantity storage unit 206 in association with the object ID 402 as the object feature quantity 601.
- FIG. 4 shows an example of detecting an object in an image.
- the objects O001 and O002 are detected in the image data 302a,
- the objects O003, O004, and O005 are detected in the image data 302b,
- the object O006 is detected in the image data 302c, and
- no object is detected in the image data 302d.
- the object detection unit 203 extracts feature amounts from the image data 302a, and the feature amounts extracted from the regions 401a and 401b in the image I001 corresponding to the image data 302a match templates stored in the template storage unit 204; therefore, objects are detected from the region 401a and the region 401b.
- the object detection unit 203 assigns object IDs 402 of O001 and O002 to the objects detected from the areas 401a and 401b.
- the object detection unit 203 stores the object appearance information 500 in the object appearance information storage unit 205 as shown in FIG. 5. Further, the object feature quantity 601 is stored in the object feature quantity storage unit 206 as shown in FIG. 6. <2-3-3. Object classification process> Here, the object classification process (S1803) performed by the object classification unit 207 will be described.
- the object classification unit 207 classifies all the objects detected by the object detection unit 203 into clusters based on the object feature quantity 601 of each object stored in the object feature quantity storage unit 206.
- the K-means method is a classification method for automatically generating clusters and classifying objects.
- a cluster feature quantity 702 representing a cluster is automatically calculated, and each object is classified into a cluster having a cluster feature quantity 702 closest to the object feature quantity 601 of the object.
- Fig. 7 shows an image of classification by the K-means method.
- Reference numerals 601a to 601i denote positions in the feature amount space 700 of the object feature amounts 601 of the corresponding objects.
- 701a to 701c are clusters generated by the K-means method, and have cluster feature amounts 702 indicated by positions 702a to 702c, respectively.
- the object classification unit 207 classifies the three objects whose object feature amounts are closest to the cluster feature amount 702a, namely the object O001, the object O003, and the object O006, into the cluster 701a. Similarly, the object O002, the object O004, and the object O007 are classified into the cluster 701b, and the object O005, the object O008, and the object O009 are classified into the cluster 701c.
- the object classification unit 207 numbers the clusters generated by the K-means method from 1 in the order in which they are generated, and assigns each cluster a cluster ID 703 formed by adding the letter "C" to the beginning of that number.
- the cluster classification information 900 as a result of the classification is stored in the cluster classification information storage unit 209.
- FIG. 9 shows an example of cluster classification information 900 obtained as a result of classifying all objects into clusters.
- the cluster feature quantity 702 is calculated as an arithmetic average of the object feature quantities 601 of all objects belonging to the cluster.
- the calculated cluster feature quantity 702 is stored in the cluster feature quantity storage unit 208.
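The nearest-cluster assignment and the arithmetic-mean cluster feature can be sketched as below; squared Euclidean distance is an assumed reading of "closest":

```python
def nearest_cluster(obj_feature, cluster_features):
    # Classify an object into the cluster whose cluster feature amount
    # 702 is closest to its object feature amount 601.
    return min(cluster_features,
               key=lambda cid: sum((o - x) ** 2
                                   for o, x in zip(obj_feature, cluster_features[cid])))

def cluster_feature(member_features):
    # Cluster feature amount 702 = arithmetic mean of the member
    # objects' feature amounts, dimension by dimension.
    n = len(member_features)
    return [sum(f[d] for f in member_features) / n
            for d in range(len(member_features[0]))]
```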
- FIG. 7 shows an example of the cluster feature amount 702. <2-3-4. Co-occurrence information generation processing> Here, the generation process (S1807) of the co-occurrence information 1100 performed by the co-occurrence information generation unit 210 for the image group 300 will be described.
- the co-occurrence information generation unit 210 detects the co-occurrence relationship of clusters and the non-co-occurrence state in each image of the image group 300, and generates co-occurrence information 1100 for all clusters.
- the process of detecting the co-occurrence relation or the non-co-occurrence state for one image and updating the co-occurrence information 1100 is referred to as a co-occurrence relation detection process.
- FIG. 19 is a flowchart when the co-occurrence information generation unit 210 generates the co-occurrence information 1100, and shows details of step S1807. It is assumed that the co-occurrence degree 1101 and the non-co-occurrence degree 1102 are all initialized to 0 before the generation process of the co-occurrence information 1100 is started.
- first, the co-occurrence relation detection process is performed for one image k (S1901). After the co-occurrence relation detection process for the image k is completed, it is determined whether there is an image that has not yet been subjected to the co-occurrence relation detection process (S1902). If such an image exists, one of those images is set as the next image k, and the process returns to S1901. If none exists, the co-occurrence information generation unit 210 ends the process of generating the co-occurrence information 1100 for the image group 300.
- the co-occurrence relation detection process for image k is performed as follows.
- first, the number of objects included in the image k is checked from the object appearance information 500, and the processing branches according to whether the number of objects is one, or two or more (S1903).
- if the image k includes exactly one object a, it can be said that the cluster A to which the object a belongs is in a non-co-occurrence state in the image k.
- the cluster A to which the object a belongs is acquired from the cluster classification information 900 stored in the cluster classification information storage unit 209 (S1904).
- the non-co-occurrence state is detected from the object a, and the non-co-occurrence degree 1102 of the cluster A is increased by 1 (S1905).
- the co-occurrence relationship detection process for the image k is terminated.
- the image I001 includes both the object O001 belonging to the cluster C001 and the object O002 belonging to the cluster C002. Therefore, in the image I001, it can be said that there are two co-occurrence relationships: that of the cluster C001 with respect to the cluster C002, and that of the cluster C002 with respect to the cluster C001.
- the co-occurrence relationship detection process is performed as follows. Here, when detecting the co-occurrence relationship between the cluster A to which the object a belongs and the cluster B to which the object b belongs, the object a is called the co-occurrence source object, and the object b is called the co-occurrence destination object of the object a.
- an object a that has not yet been used as a co-occurrence source object is selected from the objects included in the image k. Then, the cluster A to which the object a belongs is acquired from the cluster classification information 900 (S1906).
- an object b other than the object a that has not yet been used as a co-occurrence destination object of the object a is selected from the objects included in the image k.
- the cluster B to which the object b belongs is acquired from the cluster classification information 900 (S1907).
- the co-occurrence relationship is detected from the co-occurrence source object a and the co-occurrence destination object b, and the co-occurrence degree 1101 of the cluster A with respect to the cluster B is increased by 1 (S1908).
- the object b is then regarded as having been used as a co-occurrence destination object of the object a.
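The detection loop described above can be sketched as a pair of counters; the dictionary layout is an assumption:

```python
from collections import defaultdict

def build_cooccurrence(images, cluster_of):
    # images: image ID -> object IDs in the image (object appearance
    # information 500); cluster_of: object ID -> cluster ID (cluster
    # classification information 900).
    co = defaultdict(int)      # co-occurrence degree 1101, keyed (A, B)
    non_co = defaultdict(int)  # non-co-occurrence degree 1102, keyed A
    for objs in images.values():
        if len(objs) == 1:
            # A single object: its cluster is in a non-co-occurrence state.
            non_co[cluster_of[objs[0]]] += 1
        else:
            for a in objs:          # co-occurrence source object
                for b in objs:      # co-occurrence destination object
                    if a != b:
                        co[(cluster_of[a], cluster_of[b])] += 1
    return co, non_co
```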
- the evaluation value calculation unit 213 calculates the evaluation value 1401 of an object with respect to a cluster.
- the evaluation value 1401 of the object j with respect to the cluster I is calculated by multiplying the accuracy 1301 of the object j with respect to the cluster I, calculated by the accuracy calculation unit 212, by the number 901 of objects belonging to the cluster I, acquired from the cluster classification information storage unit 209.
- a method of calculating the accuracy 1301 of the object j with respect to the cluster I by the accuracy calculation unit 212 will be described later.
- <2-3-5. Accuracy calculation process> Here, the calculation process of the accuracy 1301 performed by the accuracy calculation unit 212 will be described.
- the accuracy calculation unit 212 calculates the accuracy 1301 for the cluster of objects.
- FIG. 20 is a flowchart showing the operation of the accuracy calculation unit 212 when the accuracy 1301 for a certain cluster I of an object j included in the image k is obtained.
- first, the number of objects existing in the image k including the object j is checked from the object appearance information 500 stored in the object appearance information storage unit 205, and the processing branches according to whether the number of objects is one, or two or more (S2001). If it is one, the accuracy 1301 is calculated based on the non-co-occurrence degree 1102; if it is two or more, the accuracy 1301 is calculated based on the co-occurrence degree 1101.
- the calculation process of the accuracy 1301 based on the non-co-occurrence degree 1102 is performed as follows.
- the accuracy calculation unit 212 calculates the accuracy 1301 based on the certainty factor and the support degree of the object j with respect to the cluster I, which are calculated using the non-co-occurrence degree 1102 of the cluster I, and on the similarity 1201 of the object j with respect to the cluster I calculated by the similarity calculation unit 211.
- the certainty factor and the support degree are indices indicating the strength of the correlation between a condition m and a conclusion n in data mining technology.
- the certainty factor indicates the rate at which the conclusion n occurs together when the condition m occurs.
- the support degree represents the ratio of the number of cases in which both the condition m and the conclusion n occur to the whole. When both the certainty factor and the support degree are large, it can be said that the conclusion n is highly likely to occur when the condition m occurs.
- here, the condition m is the event that an object belonging to the cluster I is included in an image,
- and the conclusion n is the event that an object belonging to the cluster I is included in an image in a non-co-occurrence state.
- the certainty factor and the support degree calculated in this way for one cluster I are referred to as the non-co-occurrence certainty factor 2102 of the cluster I and the non-co-occurrence support degree 2202 of the cluster I, respectively. That is, when the accuracy 1301 is calculated based on the non-co-occurrence degree 1102, the certainty factor of the object j with respect to the cluster I is the non-co-occurrence certainty factor 2102 of the cluster I, and the support degree of the object j with respect to the cluster I is the non-co-occurrence support degree 2202 of the cluster I.
- FIG. 21 shows the data structure and content example of the certainty factor 2100.
- the certainty factor 2100 includes a later-described co-occurrence certainty factor 2101 and the non-co-occurrence certainty factor 2102 described above.
- the non-co-occurrence certainty 2102 of the cluster C001 is 0.17. This means that when the event that an object belonging to the cluster C001 is included in an image occurs, the event that the object is included in the image in a non-co-occurrence state occurs at a rate of 17%.
- Fig. 22 shows the data structure and content example of the support level 2200.
- the support degree 2200 includes a co-occurrence support degree 2201 described later and the non-co-occurrence support degree 2202 described above.
- the non-co-occurrence support degree 2202 of the cluster C001 is 0.03. This means that when one object is selected from all objects, the event that the object belongs to the cluster C001 and is included in an image in a non-co-occurrence state occurs at a rate of 3%.
- the non-co-occurrence certainty 2102 and the non-co-occurrence support degree 2202 for the cluster I of the object j are calculated using the non-co-occurrence degree 1102 of the cluster I (S2008).
- the non-co-occurrence certainty 2102 is calculated as a ratio of the non-co-occurrence degree 1102 of the cluster I to the number 901 of objects belonging to the cluster I.
- the non-co-occurrence support degree 2202 is calculated as a ratio of the non-co-occurrence degree 1102 of the cluster I to the total number of objects.
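The two ratios can be written out directly; the total of 150 objects below is an assumption, chosen only so that the rounded outputs match the percentages quoted above:

```python
def non_cooccurrence_confidence(non_co_degree, objects_in_cluster):
    # Non-co-occurrence certainty 2102: non-co-occurrence degree 1102
    # of cluster I divided by the number 901 of objects belonging to it.
    return non_co_degree / objects_in_cluster

def non_cooccurrence_support(non_co_degree, total_objects):
    # Non-co-occurrence support 2202: non-co-occurrence degree 1102
    # divided by the total number of objects.
    return non_co_degree / total_objects

# Cluster C001: non-co-occurrence degree 5, 30 member objects, and an
# assumed total of 150 objects.
print(round(non_cooccurrence_confidence(5, 30), 2))  # 0.17
print(round(non_cooccurrence_support(5, 150), 2))    # 0.03
```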
- the non-co-occurrence certainty factor 2102 and the non-co-occurrence support degree 2202 of the cluster I calculated in this way, together with the similarity 1201 of the object j with respect to the cluster I calculated by the similarity calculation unit 211, are substituted into the formula for calculating the accuracy 1301, and the accuracy 1301 is calculated (S2009).
- the formula for calculating the accuracy 1301 is a logistic regression formula whose coefficients are determined by performing, in advance, a logistic regression analysis based on statistics for the case where an object exists alone in an image.
- logistic regression analysis, like multiple regression analysis, predicts the objective variable for arbitrary explanatory variables by deriving the relationship between the explanatory variables and the objective variable in advance using learning data.
- here, the explanatory variables correspond to the similarity 1201, the certainty factor, and the support degree of the object j with respect to the cluster I,
- and the objective variable corresponds to the accuracy 1301 that the object j belongs to the cluster I.
- an explanatory variable having a greater influence on the calculation of the accuracy has a larger coefficient.
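The logistic-regression form implied here is a sigmoid over the weighted explanatory variables; the coefficient values below are placeholders, since the fitted values would come from the prior analysis:

```python
import math

def accuracy_1301(sim, confidence, support, coef=(3.0, 2.0, 1.0), intercept=-2.0):
    # Accuracy = sigmoid(intercept + w1*similarity + w2*certainty +
    # w3*support). The coefficients are placeholder assumptions; an
    # explanatory variable with greater influence gets a larger weight.
    z = intercept + coef[0] * sim + coef[1] * confidence + coef[2] * support
    return 1.0 / (1.0 + math.exp(-z))
```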
- the calculation process of the accuracy 1301 based on the co-occurrence degree 1101 is performed as follows.
- the accuracy calculation unit 212 calculates the accuracy 1301 based on the certainty factor and the support degree of the object j with respect to the cluster I, which are calculated using the co-occurrence degree 1101 of the cluster I with respect to the cluster X to which an object x other than the object j existing in the image k belongs, and on the similarity 1201 of the object j with respect to the cluster I calculated by the similarity calculation unit 211.
- an object x that is not used in the calculation process of the accuracy 1301 for the cluster I of the object j is selected from objects other than the object j included in the image k (S2002).
- the cluster X to which the selected object x belongs is acquired from the cluster classification information 900 stored in the cluster classification information storage unit 209 (S2003). Then, from a co-occurrence degree 1101 of cluster I with respect to cluster X, a later-described co-occurrence certainty degree 2101 with respect to cluster X of cluster I and a later-described co-occurrence support degree 2201 with respect to cluster X of cluster I are calculated (S2004).
- the co-occurrence certainty 2101 of cluster I with respect to cluster X is calculated as a ratio of the co-occurrence degree 1101 with respect to cluster X of cluster I to the number 901 of objects belonging to cluster X.
- the co-occurrence support degree 2201 of the cluster I with respect to the cluster X is calculated as the ratio of the co-occurrence degree 1101 of the cluster I with respect to the cluster X to the total number of objects.
- the co-occurrence certainty 2101 and the co-occurrence support 2201 of the cluster I with respect to the cluster X to which the object x belongs, calculated in this way, are taken as the co-occurrence certainty and the co-occurrence support for the object x of the object j with respect to the cluster I.
- if an object not yet used in this calculation remains, the process returns to S2002. If not, the co-occurrence certainty and the co-occurrence support for the object having the highest co-occurrence support among the objects other than the object j included in the image k are set as the certainty factor and the support degree of the object j with respect to the cluster I (S2006).
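This per-object loop amounts to computing the two ratios for each other object in the image and keeping the pair with the highest support; the data layout and sample figures are assumptions:

```python
def certainty_and_support(image_objects, j, cluster_i, cluster_of,
                          co_degree, cluster_size, total_objects):
    # For each object x other than j in the image, derive the
    # co-occurrence certainty 2101 and support 2201 of cluster I with
    # respect to x's cluster, then keep the pair with the highest
    # support as (certainty factor, support degree) for object j.
    best = None
    for x in image_objects:
        if x == j:
            continue
        cx = cluster_of[x]
        degree = co_degree.get((cluster_i, cx), 0)
        conf = degree / cluster_size[cx]
        supp = degree / total_objects
        if best is None or supp > best[1]:
            best = (conf, supp)
    return best
```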
- the accuracy 1301 is calculated by substituting the certainty factor and the support factor calculated in this way and the similarity 1201 of the object j with respect to the cluster I calculated by the similarity calculation unit 211 into the formula for calculating the accuracy 1301 (S2007).
- similarly, the formula for calculating the accuracy 1301 is a logistic regression equation whose coefficients are determined by performing a logistic regression analysis based on statistics for the case where a plurality of objects exist in an image; explanatory variables with greater influence on the accuracy calculation have larger coefficients.
- the co-occurrence certainty 2101 and the co-occurrence support degree 2201 of the cluster I with respect to the cluster X are the certainty factor and the support degree where the condition is that an object belonging to the cluster X is included in an image and the conclusion is that an object belonging to the cluster I is included in the same image.
- the co-occurrence certainty 2101 of the cluster C001 with respect to the cluster C002 is 0.30. This means that when an event that an object belonging to the cluster C002 is included in the image occurs, an event that an object that belongs to the cluster C001 is included in the same image occurs at a rate of 30%.
- the co-occurrence support degree 2201 of the cluster C001 with respect to the cluster C002 is 0.04. This means that when one object is selected from all objects, the event that the object belongs to the cluster C002 and an object belonging to the cluster C001 is included in the image containing that object occurs at a rate of 4%. <2-4. Effect of Embodiment 1>
- the image management apparatus 100 according to the first embodiment evaluates the object importance, which corresponds to the importance of a person's face (an object) included in an image, based on the number of objects belonging to the cluster indicating that they are the same person as that face.
- since the object importance is evaluated using the number of objects belonging to nearby clusters having high similarity to the object, the evaluated object importance is closer to the value that would be obtained if the object were correctly determined to be the same person.
- if the accuracy were calculated based only on the similarity of the feature amounts, and the possibility that the object is the same person as the person with the highest number of appearances were evaluated as low, the object importance would be evaluated as low even when the object actually is that person.
- since the possibility of being the same person is evaluated using not only the similarity of the feature amounts but also the co-occurrence relationships between persons, even when the feature amounts alone suggest that the object may be another person, the evaluated object importance is closer to the value that would be obtained if the object were correctly determined to be the same person. <3.
- Embodiment 2> In Embodiment 1 of the present invention, the accuracy 1301 is calculated using the co-occurrence relationships between the clusters to which human-face objects belong; here, a modified image management apparatus 2300 in which this is changed to a method of calculating the accuracy 1301 using the co-occurrence relationship between a human face and objects other than persons will be described.
- here, the object refers to a predetermined object other than a human face detected by an object part to be described later, and is hereinafter referred to as a "co-occurrence object" to distinguish it from an "object" in the general sense.
- Co-occurring objects are, for example, cars, animals, plants, buildings, and the like.
- the co-occurrence object is used only for calculating the accuracy 1301 using the co-occurrence information, and the importance of the object is not considered.
- the hardware configuration of the modified image management apparatus 2300 is the same as that of the image management apparatus 100 according to the first embodiment.
- FIG. 23 is a diagram showing a functional configuration of the modified image management apparatus 2300 as a whole. However, description of peripheral devices is omitted, and functional blocks having functions equivalent to those of the image management device 100 are assigned the same reference numerals as in FIG.
- the modified image management apparatus 2300 adds to the image management apparatus 100 an object unit 2301 that detects and classifies objects, and the co-occurrence information generation unit 210 and the accuracy calculation unit 212 are replaced by a co-occurrence information generation unit 210a and an accuracy calculation unit 212a, respectively.
- a portion corresponding to a difference from the image management apparatus 100 will be described.
- FIG. 24 is a block diagram showing the object part 2301 in detail.
- the object unit 2301 includes an object detection unit 2401, an object appearance information storage unit 2402, an object classification unit 2403, and an object cluster classification information storage unit 2404.
- the object detection unit 2401 has a function of extracting an object feature amount, which is a feature amount of a co-occurrence object, in each image of the image group 300 stored in the image storage unit 202, detecting co-occurrence objects based on a predetermined condition, and assigning to each detected co-occurrence object an object ID 2502 for identifying it.
- the function of the object detection unit 2401 is realized, for example, when a processor executes a program stored in a memory.
- FIG. 25 shows an example of detecting an object and a co-occurrence object from an image. FIG. 25 will be described in detail later.
- the object detection unit 2401 further associates the image ID 301 with the object ID 2502 of the co-occurrence object detected in the image for each image, and stores the object appearance information 2600 in the object appearance information storage unit 2402.
- the object appearance information storage unit 2402 has a function of storing object appearance information 2600 for each image.
- the object appearance information storage unit 2402 is realized by a memory, for example.
- FIG. 26 shows an example of object appearance information 2600 stored in the object appearance information storage unit 2402. FIG. 26 will be described in detail later.
- the object classification unit 2403 has a function of classifying an object detected by the object detection unit 2401 into an object cluster based on the object feature amount extracted by the object detection unit 2401.
- the function of the object classification unit 2403 is realized, for example, when a processor executes a program stored in a memory.
- The object classification unit 2403 assigns an object cluster ID 2701 for identifying each object cluster to each object cluster, and stores, in the object cluster classification information storage unit 2404 as object cluster classification information 2700, the object cluster ID 2701 of each object cluster in association with the object IDs 2502 of the co-occurrence objects classified into that object cluster and the number of co-occurrence objects classified into that object cluster.
- the object cluster is a unit of classification when classifying co-occurrence objects based on a predetermined standard, and each object cluster corresponds to a range of object feature values different from each other.
- the object cluster classification information storage unit 2404 has a function of storing object cluster classification information 2700 for each object cluster.
- the object cluster classification information storage unit 2404 is realized by a memory, for example.
- FIG. 27 shows an example of object cluster classification information 2700 stored in the object cluster classification information storage unit 2404.
- FIG. 27 will be described in detail later.
- Object appearance information> The object appearance information 2600 is information indicating which co-occurrence object is detected in which image.
- the object appearance information 2600 is generated by the object detection unit 2401, stored in the object appearance information storage unit 2402, and used by the co-occurrence information generation unit 210a and the accuracy calculation unit 212a.
- FIG. 25 shows an example of an area 2501 where the object detection unit 2401 detects a co-occurrence object and an object ID 2502 of the co-occurrence object detected in the area 2501.
- FIG. 26 shows an example of the data structure of the object appearance information 2600 and of its contents corresponding to FIG. 25.
- The object appearance information 2600 is represented, for each image, by a set of the image ID 301 and the object IDs for identifying the co-occurrence objects detected in the image. An image may contain one co-occurrence object, a plurality of co-occurrence objects, or none at all.
- The object ID 2502 is an identifier for uniquely identifying each co-occurrence object in the modified image management apparatus 2300, and is assigned by the object detection unit 2401 so as to correspond to the co-occurrence objects on a one-to-one basis.
- the object ID 2502 is generated by the object detection unit 2401. For example, the object ID 2502 is numbered from 1 in the order in which the object detection unit 2401 detects the object, and an alphabet “B” is added to the head of the number.
- B001 is assigned to the co-occurrence object detected in the area 2501a
- B002 is assigned to the co-occurrence object detected in the area 2501b
- the object ID 2502 of B003 is assigned to the co-occurrence object detected in the area 2501c.
- When a specific co-occurrence object shown in FIG. 25 is described, the co-occurrence object is referred to by its object ID 2502; for example, the co-occurrence object identified by the object ID 2502 of B001 is referred to as the co-occurrence object B001.
- From the object appearance information 2600, the object IDs 2502 of the co-occurrence objects included in a specific image can be acquired, and conversely, the image ID 301 of the image including a specific co-occurrence object can also be acquired.
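As an illustration, the object appearance information 2600 can be sketched as a forward map from image IDs to detected object IDs, plus an inverted index for the reverse lookup just described. The image and object IDs below are hypothetical, loosely following FIGS. 25 and 26:

```python
# Sketch (hypothetical data) of the object appearance information 2600:
# a forward map from an image ID 301 to the object IDs 2502 of the
# co-occurrence objects detected in that image.
object_appearance = {
    "I004": ["B001"],
    "I005": ["B002"],
    "I006": ["B003"],
    "I003": [],          # an image may contain no co-occurrence object
}

# Inverted index for the reverse lookup: object ID -> image ID containing it.
image_of_object = {}
for image_id, object_ids in object_appearance.items():
    for object_id in object_ids:
        image_of_object[object_id] = image_id

print(object_appearance["I004"])  # ['B001']
print(image_of_object["B003"])    # I006
```

Both lookup directions described above are then simple dictionary accesses.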
- the object feature amount is a feature amount related to an object.
- the object feature amount is a vector having a plurality of numerical values indicating image features as components.
- As a feature amount related to an automobile, for example, when pixels having pixel values recognizable as a wheel (for example, pixel values indicating black) are arranged in a circle, the diameter of the circle, the position of its center point, and the like can be used as the feature amount of the wheel.
- a vector including a wheel feature value, a window feature value, or the like as a component can be used as an object feature value of an automobile.
- the object feature amount is generated by the object detection unit 2401 and used by the object classification unit 2403.
- 3-2-3.
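As a minimal sketch of this composition (feature names and values are hypothetical), lower-level features such as the wheel feature can be flattened into a single numeric vector serving as the object feature amount:

```python
# Hypothetical component features of an automobile.
wheel_feature = {"diameter": 42.0, "center_x": 120.0, "center_y": 88.0}
window_feature = {"width": 60.0, "height": 35.0}

def to_object_feature(*features):
    """Flatten component features into one object feature vector."""
    vector = []
    for feature in features:
        vector.extend(feature.values())  # dicts preserve insertion order
    return vector

car_feature = to_object_feature(wheel_feature, window_feature)
print(car_feature)  # [42.0, 120.0, 88.0, 60.0, 35.0]
```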
- The object cluster ID 2701 is an identifier for uniquely identifying each object cluster in the modified image management apparatus 2300, and is assigned by the object classification unit 2403 so as to correspond to the object clusters on a one-to-one basis.
- the object cluster ID 2701 is generated by the object classification unit 2403. For example, it is assumed that the object cluster ID 2701 is numbered from 1 in the order in which the object classification unit 2403 generates clusters, and the alphabet “BC” is added to the head of the number.
- the object cluster classification information 2700 is information indicating which co-occurrence object is classified into which object cluster by the object classification unit 2403.
- the object cluster classification information 2700 is generated by the object classification unit 2403, stored in the object cluster classification information storage unit 2404, and used by the co-occurrence information generation unit 210a and the evaluation value calculation unit 213a.
- FIG. 27 shows an example of the data structure of the object cluster classification information 2700 and the contents of the object cluster classification information 2700 as a result of classifying the co-occurrence objects in FIG.
- the object cluster classification information 2700 includes, for each object cluster, a set of an object cluster ID 2701, an object ID 2502 of each co-occurrence object belonging to the object cluster, and the number of co-occurrence objects 2702 belonging to the object cluster.
- When a specific object cluster shown in FIG. 27 is described, the object cluster is referred to by its object cluster ID 2701; for example, the object cluster identified by the object cluster ID 2701 of BC001 is referred to as the object cluster BC001.
- In FIG. 27, the object cluster classification information 2700 of the object cluster BC001 indicates that the co-occurrence objects B001 and B003 belong to it, and that the total number 2702 of co-occurrence objects belonging to the object cluster BC001 is 21.
- 3-2-4. Co-occurrence information> The co-occurrence information 2800 in the second embodiment is information indicating a relationship between a cluster and an object cluster.
- When the cluster A and the object cluster B co-occur, it is said that there is a co-occurrence relationship between the cluster A and the object cluster B.
- More precisely, when the event "an object belonging to the cluster A is included in an image" occurs and the event "a co-occurrence object belonging to the object cluster B is included in the same image" also occurs, there is a co-occurrence relationship from the cluster A to the object cluster B.
- the co-occurrence information 2800 is information relating to the co-occurrence relationship of the cluster to the object cluster, and is generated by the co-occurrence information generation unit 210a and used by the accuracy calculation unit 212a.
- FIG. 28 shows an example of the data structure and contents of the co-occurrence information 2800.
- The co-occurrence information 2800 is composed of co-occurrence degrees 2801, each indicating how often a co-occurrence relationship from a given cluster to a given object cluster exists across all the images of the image group 300.
- the co-occurrence degree 2801 is the number of times the co-occurrence relationship is detected in the image group 300.
- the co-occurrence degree 2801 of the cluster A with respect to the object cluster B is the number of times that the co-occurrence relationship between the cluster A and the object cluster B is detected in the image group 300.
- the co-occurrence degree 2801 of the cluster C001 for the object cluster BC001 is 0, the co-occurrence degree 2801 for the object cluster BC002 is 3, and the co-occurrence degree 2801 for the object cluster BC003 is 5.
- FIG. 29 is a flowchart showing the operation of the modified image management apparatus 2300.
- the same operations as those in the image management apparatus 100 are assigned the same reference numerals as those in FIG.
- the co-occurrence object detection process (S2901) and the co-occurrence object classification process (S2902) are added to the operation of the image management apparatus 100 after the object classification process (S1803).
- the contents of the co-occurrence information 2800 generation process (S1807) and the evaluation value 1401 calculation process (S1808) are changed (S1807a and S1808a, respectively).
- the object detection unit 2401 first extracts an object feature amount from the target image for detecting the co-occurrence object.
- As a method for extracting an object feature quantity from an image, there is, for example, a method of extracting feature quantities such as the periodicity and directionality of the distribution of pixel values of the image data using a Gabor filter, in the same manner as the feature quantity extraction method in the object detection processing.
- The object detection unit 2401 then compares the extracted object feature amount with a template and detects a co-occurrence object.
- a co-occurrence object is detected when the extracted object feature amount matches the pattern of the object feature amount possessed by the template.
- the co-occurrence objects detected by the object detection unit 2401 are numbered from 1 in the order in which the co-occurrence objects are detected, and an object ID 2502 with an alphabet “B” added to the head of the number is assigned.
- the object detection unit 2401 stores, in the object appearance information storage unit 2402, as a set of object appearance information 2600, a set of the image ID 301 of the image that is the detection target of the co-occurrence object and the object IDs 2502 of all the co-occurrence objects detected in the image. To do.
- FIG. 25 shows an example of detecting an object in an image.
- the co-occurrence object B001 is detected in the image 302d
- the co-occurrence object B002 is detected in the image 302e
- the co-occurrence object B003 is detected in the image 302f
- the co-occurrence object is not detected in the image 302c.
- For example, the object detection unit 2401 extracts object feature amounts from the image data 302d, and when the object feature amount extracted from the area 2501a in the image I004 corresponding to the image data 302d satisfies the criterion defined by the template, a co-occurrence object is detected from the area 2501a.
- the object detection unit 2401 assigns an object ID 2502 of B001 to the co-occurrence object detected from the area 2501a.
- Then, the object detection unit 2401 stores the object appearance information 2600 in the object appearance information storage unit 2402 as shown in FIG. 26.
- 3-3-2. Co-occurrence object classification process> Here, the co-occurrence object classification process (S2902) performed by the object classification unit 2403 will be described.
- the object classification unit 2403 classifies all co-occurrence objects detected by the object detection unit 2401 into object clusters based on the object feature amounts of the co-occurrence objects extracted by the object detection unit 2401.
- For the classification, a method such as an SVM (Support Vector Machine), for example, can be used.
- the object cluster classification information 2700 as a result of the classification is stored in the object cluster classification information storage unit 2404.
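A greedy nearest-centroid grouping can stand in for the SVM-based classification as a self-contained sketch. The feature values and the distance threshold below are hypothetical; the result mirrors FIG. 27 only in that B001 and B003 fall into the same object cluster BC001:

```python
import math

def classify_objects(object_features, threshold=10.0):
    """Assign each co-occurrence object to the nearest existing object
    cluster, or open a new object cluster when no centroid lies within
    `threshold`. A simplified stand-in for the SVM-based classifier."""
    clusters = []  # each: {"id", "centroid", "members"}
    for object_id, feature in object_features.items():
        best = None
        for cluster in clusters:
            d = math.dist(cluster["centroid"], feature)
            if d <= threshold and (best is None or d < best[0]):
                best = (d, cluster)
        if best is None:
            clusters.append({"id": f"BC{len(clusters) + 1:03d}",
                             "centroid": list(feature),
                             "members": [object_id]})
        else:
            best[1]["members"].append(object_id)
    return clusters

features = {"B001": [1.0, 2.0], "B002": [50.0, 40.0], "B003": [2.0, 1.0]}
clusters = classify_objects(features)
print([(c["id"], c["members"]) for c in clusters])
# [('BC001', ['B001', 'B003']), ('BC002', ['B002'])]
```

Each object cluster here corresponds to a range of object feature values, as the text above describes.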
- An example of the object cluster classification information 2700 after all co-occurrence objects have been classified into object clusters is shown in FIG. 27.
- 3-3-3. Co-occurrence information generation processing> Here, the generation process (S1807a) of the co-occurrence information 2800 performed by the co-occurrence information generation unit 210a for the image group 300 will be described.
- the co-occurrence information generation unit 210a detects the co-occurrence relationship of the clusters in the images of the image group 300 with respect to the object clusters, and generates co-occurrence information 2800 between all clusters and all object clusters.
- the process of detecting the co-occurrence relation for one image and updating the co-occurrence information 2800 is referred to as a co-occurrence relation detection process.
- FIG. 30 is a flowchart when the co-occurrence information generation unit 210a generates the co-occurrence information 2800, and shows details of step S1807a. It is assumed that the co-occurrence degree 2801 is initialized to 0 before starting the process of generating the co-occurrence information 2800.
- First, the co-occurrence relationship detection process is performed for an image k. After the co-occurrence relationship detection process for the image k is completed, it is determined whether there is an image that has not yet been subjected to the co-occurrence relationship detection process (S3002). If such an image exists, one of those images is set as the next image k and the process returns to S3001; if none exists, the co-occurrence information generation unit 210a ends the process of generating the co-occurrence information 2800 for the image group 300.
- the co-occurrence relation detection process for image k is performed as follows.
- the image 302f includes both the object detected from the area 401i (hereinafter referred to as object O009) and the object B003. Therefore, in the image 302f, it can be said that there is one co-occurrence relationship between the cluster to which the object O009 belongs and the object cluster to which the object B003 belongs.
- The co-occurrence relationship detection process is performed as follows. Here, when detecting the co-occurrence relationship between the cluster A to which an object a belongs and the object cluster B to which a co-occurrence object b belongs, the object a is referred to as the co-occurrence source object, and the co-occurrence object b as the co-occurrence destination object of the object a.
- an object a that has not yet been used as a co-occurrence source object is selected from the objects included in the image k. Then, the cluster A to which the object a belongs is acquired from the cluster classification information 900 stored in the cluster classification information storage unit 209 (S3004).
- a co-occurrence object b that is not yet used as a co-occurrence destination object of the object a is selected from the co-occurrence objects included in the image k.
- the object cluster B to which the co-occurrence object b belongs is acquired (S3005).
- When the co-occurrence source object a and the co-occurrence destination object b have been obtained, one co-occurrence relationship is detected, and the co-occurrence degree 2801 of the cluster A with respect to the object cluster B is increased by 1 (S3006).
- It is then determined whether the image k contains a co-occurrence object that has not yet been used as a co-occurrence destination object of the object a.
- If such a co-occurrence object exists, one of them is set as the next co-occurrence destination object b and the process returns to S3005. If none exists, the object a is regarded as having been used as a co-occurrence source object, and the same processing is repeated for any object not yet used as a co-occurrence source object.
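The loop structure above can be sketched as follows; the cluster assignments are hypothetical stand-ins for the cluster classification information 900 and the object cluster classification information 2700:

```python
from collections import defaultdict

def build_cooccurrence(images, cluster_of, object_cluster_of):
    """For every image, every (source object, destination co-occurrence
    object) pair adds 1 to the co-occurrence degree 2801 of the source
    object's cluster for the destination object's object cluster."""
    degrees = defaultdict(int)  # (cluster ID, object cluster ID) -> degree
    for image in images:
        for obj in image["objects"]:                     # co-occurrence sources
            cluster = cluster_of[obj]
            for coobj in image["cooccurrence_objects"]:  # co-occurrence destinations
                degrees[(cluster, object_cluster_of[coobj])] += 1
    return degrees

cluster_of = {"O009": "C001"}          # hypothetical assignment
object_cluster_of = {"B003": "BC001"}  # hypothetical assignment
images = [{"objects": ["O009"], "cooccurrence_objects": ["B003"]}]
degrees = build_cooccurrence(images, cluster_of, object_cluster_of)
print(degrees[("C001", "BC001")])  # 1
```

With the single object and single co-occurrence object of the image 302f example, exactly one co-occurrence relationship is counted.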
- the evaluation value calculation unit 213 calculates an evaluation value 1401 for the cluster of objects in the same manner as in the first embodiment.
- the accuracy 1301 for the cluster of objects is the one calculated by the accuracy calculation unit 212a.
- The calculation process of the accuracy 1301 in the accuracy calculation unit 212a is described below.
- 3-3-5. Accuracy calculation process> Here, the calculation process of the accuracy 1301 performed by the accuracy calculation unit 212a will be described.
- the accuracy calculation unit 212a calculates the accuracy 1301 for the cluster of objects.
- FIG. 31 is a flowchart showing the operation of the accuracy calculation unit 212a when determining the accuracy 1301 for a certain cluster I of an object j included in the image k.
- First, whether or not a co-occurrence object is included in the image k is determined from the object appearance information 2600 stored in the object appearance information storage unit 2402 (S3101). If no co-occurrence object is included, the accuracy 1301 for the cluster I of the object j is set to 0 (S3108). If one is included, the accuracy 1301 is calculated based on the co-occurrence degree 2801.
- the calculation process of the accuracy 1301 based on the co-occurrence degree 2801 is performed as follows.
- The accuracy calculation unit 212a calculates the accuracy 1301 for the cluster I of the object j based on the co-occurrence certainty and the co-occurrence support for the cluster I, calculated using the co-occurrence degree 2801 for the object cluster X to which each co-occurrence object x existing in the image k belongs, and on the similarity 1201 for the cluster I of the object j calculated by the similarity calculation unit 211.
- a co-occurrence object x that is not used in the calculation process of the accuracy 1301 for the cluster I of the object j is selected from the co-occurrence objects included in the image k (S3102).
- The object cluster X to which the selected co-occurrence object x belongs is acquired from the object cluster classification information 2700 stored in the object cluster classification information storage unit 2404 (S3103). Then, from the co-occurrence degree 2801 of the cluster I with respect to the object cluster X, a co-occurrence certainty factor 3201 (described later) of the cluster I for the object cluster X and a co-occurrence support degree 3301 (described later) of the cluster I for the object cluster X are calculated (S3104).
- The co-occurrence certainty 3201 of the cluster I with respect to the object cluster X is calculated as the ratio of the co-occurrence degree 2801 of the cluster I with respect to the object cluster X to the number 2702 of co-occurrence objects belonging to the object cluster X.
- the co-occurrence support 3301 of the cluster I with respect to the object cluster X is calculated as a ratio of the co-occurrence degree 2801 of the cluster I with respect to the object cluster X to the sum of the number of all objects and the number of all co-occurrence objects.
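These two ratios can be written down directly. The totals in the example call are hypothetical, chosen so that the results match the values quoted later for cluster C001 and object cluster BC002 (whose co-occurrence degree is 3, per FIG. 28):

```python
def cooccurrence_confidence(degree, num_objects_in_object_cluster):
    """Co-occurrence certainty 3201: the co-occurrence degree 2801 of
    cluster I for object cluster X over the number 2702 of co-occurrence
    objects belonging to object cluster X."""
    return degree / num_objects_in_object_cluster

def cooccurrence_support(degree, total_objects_and_cooccurrence_objects):
    """Co-occurrence support 3301: the co-occurrence degree 2801 over the
    sum of the number of all objects and all co-occurrence objects."""
    return degree / total_objects_and_cooccurrence_objects

# Hypothetical totals: 5 co-occurrence objects in BC002, and 375 objects
# plus co-occurrence objects overall.
print(cooccurrence_confidence(3, 5))  # 0.6
print(cooccurrence_support(3, 375))   # 0.008
```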
- When the co-occurrence certainty 3201 and the co-occurrence support 3301 of the cluster I have been calculated for the object cluster X to which each co-occurrence object x belongs, the accuracy 1301 is calculated by substituting them, together with the similarity 1201 for the cluster I of the object j calculated by the similarity calculation unit 211, into the formula for calculating the accuracy 1301 (S3107).
- the formula for calculating the accuracy 1301 is a logistic regression equation in which a coefficient is determined by performing a logistic regression analysis based on statistics in the case where an object and a co-occurring object exist in the image.
- the explanatory variable that has a greater influence on the calculation has a larger coefficient.
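Such a logistic regression equation has the general shape below; the coefficient values here are purely hypothetical placeholders for those a regression analysis would produce:

```python
import math

def accuracy_1301(similarity, confidence, support,
                  coef=(0.2, 1.5, 2.0, 40.0)):  # hypothetical coefficients
    """General shape of the logistic regression equation for the accuracy
    1301: a weighted sum of the explanatory variables (similarity 1201,
    co-occurrence certainty 3201, co-occurrence support 3301) passed
    through the logistic function, yielding a value in (0, 1)."""
    b0, b1, b2, b3 = coef
    z = b0 + b1 * similarity + b2 * confidence + b3 * support
    return 1.0 / (1.0 + math.exp(-z))

a = accuracy_1301(0.8, 0.60, 0.008)
print(0.0 < a < 1.0)  # True
```

A larger coefficient on an explanatory variable gives it a larger influence on the computed accuracy, as the text notes.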
- The co-occurrence certainty 3201 and the co-occurrence support degree 3301 of the cluster I for the object cluster X indicate how likely an object belonging to the cluster I is to be included in the same image, given the condition that a co-occurrence object belonging to the object cluster X is included in that image.
- For example, the co-occurrence certainty 3201 of the cluster C001 with respect to the object cluster BC002 is 0.60. This means that when the event that a co-occurrence object belonging to the object cluster BC002 is included in an image occurs, the event that an object belonging to the cluster C001 is included in the same image occurs at a rate of 60%.
- Further, the co-occurrence support degree 3301 of the cluster C001 with respect to the object cluster BC002 is 0.008. This means that when one object or co-occurrence object is selected from all objects and all co-occurrence objects, the event that the selected one is a co-occurrence object belonging to the object cluster BC002 and that an object belonging to the cluster C001 is included in the image containing it occurs at a rate of 0.8%.
- 3-4. Effects of Embodiment 2> Similar to the image management apparatus 100, the modified image management apparatus 2300 according to the second embodiment makes it easy to select, from an enormous number of images, an image in which an important person of high interest is shown.
- the image management apparatus using this method is obtained by adding the co-occurrence information generation unit 210 of the image management apparatus 100 of the first embodiment to the above-described modified image management apparatus 2300 and changing the operation of the accuracy calculation unit 212a.
- FIG. 34 shows a flowchart of processing in which the accuracy calculation unit 212a whose operation has been changed calculates the accuracy 1301 for the cluster I of the object j included in the image k.
- First, the number of objects existing in the image k containing the object j is checked from the object appearance information 500 stored in the object appearance information storage unit 205, and the cases are divided according to whether the number of objects is one or two or more (S3401).
- When two or more objects are included, the accuracy 1301 is calculated based on the co-occurrence degree 1101 of the cluster I for the cluster to which another object included in the image k belongs. This processing is the same as the processing in steps S2002 to S2007 in the first embodiment.
- When only one object is included and a co-occurrence object is included, the accuracy 1301 is calculated based on the co-occurrence degree 2801 of the cluster I with respect to the object cluster to which the included co-occurrence object belongs. This processing is the same as the processing in steps S3004 to S3008 in the second embodiment.
- When only one object is included and no co-occurrence object is included, the accuracy 1301 is calculated based on the non-co-occurrence degree 1102 of the cluster I. This processing is the same as the processing in steps S2008 to S2009 in the first embodiment.
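The three-way case analysis can be summarized in a small dispatch sketch (function and return-value names hypothetical):

```python
def accuracy_basis(num_objects, has_cooccurrence_object):
    """Choose which statistic the accuracy 1301 is computed from, preferring
    person-person co-occurrence, then person-object co-occurrence, then the
    non-co-occurrence degree."""
    if num_objects >= 2:
        return "co-occurrence degree 1101"   # Embodiment 1, S2002-S2007
    if has_cooccurrence_object:
        return "co-occurrence degree 2801"   # Embodiment 2, S3004-S3008
    return "non-co-occurrence degree 1102"   # Embodiment 1, S2008-S2009

print(accuracy_basis(2, False))  # co-occurrence degree 1101
print(accuracy_basis(1, True))   # co-occurrence degree 2801
print(accuracy_basis(1, False))  # non-co-occurrence degree 1102
```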
- In this way, the image management apparatus evaluates the importance of an image using the co-occurrence relationship between persons whenever possible, and for an image for which the co-occurrence relationship between persons cannot be used, it evaluates the importance of the image based on the co-occurrence relationship with objects or on the number of times a person appears alone.
- Embodiment 3> As the third embodiment of the present invention, a modified image management apparatus 3500 will be described, which is changed so that the accuracy 1301, calculated in the first embodiment based on the similarity, the certainty, and the support, is further calculated based on the cluster reliability.
- The cluster reliability is the degree to which the object feature amounts of the objects belonging to a cluster are concentrated around the cluster feature amount of that cluster; that is, it indicates how small the deviations of the object feature amounts of the objects belonging to the cluster are overall.
- For example, the distance between the cluster feature amount 702a of the cluster C001 and the object feature amounts 601a, 603c, and 601f of the objects belonging to the cluster C001 indicates the magnitude of the difference in feature amounts. That is, the closer the distance between the cluster feature amount and the object feature amount of each object belonging to the cluster, the higher the degree of concentration of the object feature amounts with respect to the cluster feature amount, and the higher the reliability of the cluster.
- the hardware configuration of the modified image management apparatus 3500 is the same as that of the image management apparatus 100 of the first embodiment.
- FIG. 35 is a diagram showing the overall functional configuration of the modified image management apparatus 3500. However, description of peripheral devices is omitted, and functional blocks having functions equivalent to those of the image management apparatus 100 are assigned the same reference numerals as in FIG.
- The modified image management apparatus 3500 is obtained by adding to the image management apparatus 100 a reliability calculation unit 3501 for calculating the reliability of clusters, and by changing the accuracy calculation unit 212 to an accuracy calculation unit 212b.
- Hereinafter, the portions that differ from the image management apparatus 100 will be described.
- the reliability calculation unit 3501 has a function of calculating the reliability of the cluster for all clusters. Details of the reliability calculation method will be described later.
- The accuracy calculation unit 212b has a function of calculating the accuracy 1301, which is used in the calculation process of the evaluation value 1401 by the evaluation value calculation unit 213, based not only on the co-occurrence information 1100 and the similarity 1201 used by the accuracy calculation unit 212 but also on the reliability 3601 calculated by the reliability calculation unit 3501.
- 4-2. Data> 4-2-1. Reliability information> The reliability information 3600 is information indicating the reliability 3601 for each cluster. It is generated and updated by the reliability calculation unit 3501 and used by the accuracy calculation unit 212b.
- FIG. 36 shows a data configuration and content example of the reliability information 3600.
- Here, the reciprocal of the value obtained by dividing the sum of the differences between the cluster feature quantity of a cluster and the object feature quantities of the objects belonging to the cluster by the number of objects belonging to the cluster is defined as the reliability of the cluster.
- Since each feature quantity is composed of a plurality of components, the squares of the differences between the corresponding feature quantity components of the cluster and the object are summed, and the square root of that sum is taken as the difference between the cluster feature quantity and the object feature quantity.
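Under that definition, the reliability and the per-object difference can be sketched as follows; the feature values in the example are those of the cluster C001 / object O001 worked example given later in this section:

```python
import math

def cluster_reliability(cluster_feature, object_features):
    """Reliability 3601 of a cluster: the reciprocal of the mean Euclidean
    distance between the cluster feature amount and the object feature
    amounts of the objects belonging to the cluster."""
    total = sum(math.dist(cluster_feature, f) for f in object_features)
    return len(object_features) / total  # reciprocal of the mean difference

# Difference between the cluster C001 and object O001 feature amounts:
d = math.dist([94.4, 90.2, 79.8], [90.3, 98.4, 71.4])
print(round(d, 2))  # 12.43
```

A tighter cluster (smaller mean distance) yields a larger reliability value, matching the description above.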
- FIG. 37 is a flowchart showing the operation of the modified image management apparatus 3500. However, the same operations as those in the image management apparatus 100 are assigned the same reference numerals as those in FIG.
- The operation of the modified image management apparatus 3500 differs from that of the image management apparatus 100 in that a calculation process (S3701) of the reliability 3601 is added after the object classification process (S1803), and in that the content of the calculation process of the accuracy 1301 used for the calculation of the evaluation value 1401 has been changed (S1808b).
- FIG. 38 shows a flowchart relating to the calculation process of the reliability 3601.
- the reliability calculation unit 3501 acquires the cluster feature amount of the cluster from the cluster feature amount storage unit 208 (step S3801), and focuses on one of the objects belonging to the cluster from the cluster classification information 900 (step S3802). Thereafter, the object feature amount of the object of interest is acquired from the object feature amount storage unit 206 (step S3803), and the difference between the acquired object feature amount and the cluster feature amount is calculated (step S3804).
- For example, the difference between the feature amounts of the cluster C001 and the object O001 is calculated as the square root of the sum of the square of the difference of the feature amount component 1, (94.4 - 90.3)^2, the square of the difference of the feature amount component 2, (90.2 - 98.4)^2, and the square of the difference of the feature amount component 3, (79.8 - 71.4)^2, which is approximately 12.43 (step S3805).
- the processing from step S3801 to step S3805 is repeated until the difference between the object feature value of all the objects belonging to the cluster and the cluster feature value is calculated.
- When the differences between the object feature amounts of all the objects belonging to the cluster and the cluster feature amount have been calculated, the calculated differences are summed, and the sum is divided by the number of objects belonging to the cluster (step S3806). The reciprocal of the obtained value is set as the reliability of the cluster (step S3807).
- The processing from step S3801 to step S3808 is repeated until the reliability has been calculated for all the clusters registered in the cluster classification information 900.
- 4-3-2. Accuracy calculation process> Here, the calculation process of the accuracy 1301 performed by the accuracy calculation unit 212b will be described.
- FIG. 39 shows a flowchart of the accuracy calculation process.
- This processing is obtained from the accuracy calculation processing shown in FIG. 20 by adding, after step S2006, a step of acquiring the reliability of the cluster to which the object j belongs and the reliability of the cluster with the highest support degree (step S3901), and by replacing the accuracy calculation performed in step S2007 with an accuracy calculation that uses, in addition to the similarity, the certainty, and the support, the reliabilities acquired in step S3901 (step S3902).
- Specifically, when a plurality of objects are included in the image k, the certainty factor and the support factor of the cluster having the highest support among the clusters co-occurring with the cluster to which the object j belongs are selected (steps S2001 to S2006).
- the reliability of the cluster I to which the object j belongs and the reliability of the cluster with the highest support are acquired from the reliability information 3600 (step S3901).
- Then, the accuracy 1301 is calculated by substituting these values, together with the reliabilities acquired in step S3901, into the formula for calculating the accuracy 1301 (step S3902).
- The formula here is a logistic regression formula whose coefficients are determined by performing a logistic regression analysis based on statistics for cases in which a plurality of objects exist in the image; an explanatory variable that has a larger influence on the accuracy calculation has a larger coefficient.
- On the other hand, when the object j exists alone in the image k, the certainty factor and the support factor are calculated based on the non-co-occurrence degree of the cluster I to which the object j belongs (step S2008).
- the reliability of the cluster I is acquired from the reliability information 3600 (step S3903).
- Then, the accuracy 1301 is calculated by substituting the certainty factor and the support factor calculated in step S2008, the similarity 1201 for the cluster I of the object j calculated by the similarity calculation unit 211, and the reliability of the cluster I to which the object j belongs acquired in step S3903 into the formula for calculating the accuracy 1301 (step S3904).
- The formula here is a logistic regression formula whose coefficients are determined by performing a logistic regression analysis based on statistics for cases in which an object exists alone in the image; an explanatory variable that has a larger influence on the accuracy calculation has a larger coefficient.
- Thereafter, as in the first embodiment, an evaluation value for each cluster is obtained from the accuracy with which the object belongs to the cluster and the number of objects belonging to the cluster, and the evaluation values of the object for the respective clusters give the object importance of the object.
- Then, with the total of the object importances of the objects included in an image as the image importance, the images are displayed in descending order of image importance.
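That final ranking step can be sketched as follows (image IDs, object IDs, and importance values are hypothetical):

```python
def rank_images(image_objects, object_importance):
    """Image importance: the total object importance of the objects in the
    image; images are returned in descending order of image importance."""
    scores = {image_id: sum(object_importance[o] for o in objs)
              for image_id, objs in image_objects.items()}
    return sorted(scores, key=scores.get, reverse=True)

image_objects = {"I001": ["O001", "O002"], "I002": ["O003"]}
object_importance = {"O001": 0.4, "O002": 0.5, "O003": 0.7}
print(rank_images(image_objects, object_importance))  # ['I001', 'I002']
```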
- In the above embodiments, the image management apparatus has been described as an example, but the present invention is not limited to an apparatus that mainly manages images.
- For example, the present invention may be applied to a storage device for still images or moving images such as a file server, a playback device for still images or moving images, a digital camera, a photographing device such as a camera-equipped mobile phone or a movie camera, a personal computer (PC), or the like.
- In short, the present invention may be applied to any apparatus capable of managing images.
- the image acquisition unit 201 includes a USB input terminal and acquires an image group from the photographing apparatus 110 via a cable such as a USB cable; however, the image group need not be acquired from the USB input terminal.
- the image group may be input by wireless communication, or may be input via a recording medium such as a memory card.
- the image group is input from the imaging device 110 to the image management device.
- the present invention is not limited to the imaging device, and any device can be used as long as the image group can be input to the image management device.
- an image group may be input from a file server storing images through a network. In short, it is sufficient that the image management apparatus can acquire the image group.
- the image acquisition unit 201 acquires an image group from the photographing apparatus 110 that is an external device.
- the image group may be acquired from an internal component of the image management apparatus.
- the image management apparatus itself may include an image storage unit such as a hard disk, and the image acquisition unit 201 may acquire an image group from the image storage unit.
- the image acquisition unit 201 can acquire the image group to be evaluated, it is not necessary to acquire all the image groups at once. For example, the image acquisition unit 201 may acquire one image or several images at a time and add an image to the image group 300 each time.
- all the images of the image group 300 acquired by the image acquisition unit 201, including the pixel values of the image data 302, are stored in the image storage unit 202; however, as long as the image data 302 to be processed can be referred to when the image management apparatus performs processing, it is not always necessary to store all the image data 302 in the image storage unit 202.
- for example, the image storage unit 202 may store only the image IDs 301 of the image group 300 and the image data 302 of the one image being processed, and the image data 302 needed by the object detection unit 203, the object detection unit 2401, and the image output unit 217 may be acquired by the image acquisition unit 201 from the external device one image at a time; in short, it is only necessary that the apparatus can access all the images when processing is performed using the image group 300.
- the image ID 301 generated by the image acquisition unit 201 is used to identify an image; however, as long as each image can be uniquely identified, the image ID 301 does not necessarily need to be generated by the image acquisition unit 201. For example, when an image is acquired as a file, the file name of the image may be used as the image ID 301; alternatively, the memory address of the beginning of the image data 302 when the image data 302 is stored in memory may be used as the image ID 301.
- the object is a person's face and the template is data indicating a feature amount pattern related to the person's face.
- the present invention is not limited to the person's face.
- the object may be a pet animal and the template may be replaced with pattern data relating to the animal.
- alternatively, an object such as an automobile or a building may be detected using a template related to that object.
- the object classification unit 207 calculates the cluster feature quantity 702 of a cluster from the object feature quantities 601 of the objects classified into the cluster, but this calculation is not always necessary; the cluster feature quantity 702 may be used as it is without being updated. In short, a cluster only needs to have a cluster feature amount 702 for calculating the similarity 1201 of an object to the cluster.
- the cluster classification information storage unit 209 also stores the number of objects classified into each cluster. However, it is not always necessary to store the number of objects. For example, when using the number of objects belonging to a cluster, if the using side counts the objects belonging to the cluster each time, the cluster classification information storage unit 209 does not need to store the number of objects. In short, it is only necessary to obtain the number of objects classified into each cluster.
- the image importance is obtained by adding up all the object importance levels 1501 of the objects included in the image, so that an image containing many important objects is evaluated highly; however, the present invention is not limited to this. For example, the average object importance of the objects included in the image may be used, or the highest object importance may be selected and used as the image importance as it is. Further, the evaluation may be weighted by the ratio of the area each object occupies in the image. In short, the image importance may be calculated in any way that uses the object importance of the objects included in the image.
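The alternatives just listed (total, average, or maximum of the object importances) can be sketched as one function. The "mode" parameter and the handling of an image with no objects are illustrative assumptions; the area-weighted variant is omitted.

```python
def image_importance(object_importances, mode="sum"):
    """Computes the image importance from the object importances of the
    objects included in the image, using one of the variants described."""
    if not object_importances:
        return 0.0  # assumption: an image with no objects scores 0
    if mode == "sum":
        return sum(object_importances)
    if mode == "average":
        return sum(object_importances) / len(object_importances)
    if mode == "max":
        return max(object_importances)
    raise ValueError("unknown mode: " + mode)
```
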
- the image importance is evaluated using only the object importance, but the present invention is not limited to this.
- the importance level may be evaluated with respect to the background, the shooting situation, and the like, and in addition to the object importance level, those importance levels may be used for evaluating the image importance level.
- other evaluation means may be further combined.
- the image group 300 is arranged in descending order from the image with the highest image importance and output to the display device 120; however, the present invention is not limited to this.
- for example, the image importance value may be added to each image as image metadata and the images output in the same order as when the image group 300 was input; in short, it suffices that the image importance is evaluated.
- the image output unit 217 includes an HDMI output terminal and outputs video from the image management apparatus to the display device 120 via an HDMI cable; however, the output is not limited to this.
- an image may be output to the display device 120 via a DVI cable.
- the output target need not be limited to the display device, and the output content need not be limited to video.
- an image with high image importance may be printed by connecting to a printer.
- an image file to which an image importance value is added as image metadata may be recorded by connecting to an external storage device.
- the image management apparatus includes a memory for storing data.
- the present invention is not limited to this as long as it is a means for storing data.
- a hard disk or other data recording medium may be used.
- the logistic regression analysis is used to calculate the accuracy 1301, but the present invention is not limited to logistic regression analysis; the accuracy may be calculated by another method using the similarity and the co-occurrence information, or using the similarity, the co-occurrence information, and the reliability.
- the sum of the accuracy 1301 for all clusters of one object is not necessarily 1, but the accuracy 1301 may be normalized so that the sum is 1.
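The normalization suggested above, making the accuracies 1301 of one object sum to 1 over all clusters, can be sketched as follows; the guard for the all-zero case is an assumption, since the description does not cover it.

```python
def normalize_accuracies(accuracies):
    """Scales one object's per-cluster accuracies so that their sum is 1.
    If every accuracy is 0 the list is returned unchanged (an assumption)."""
    total = sum(accuracies)
    if total == 0:
        return list(accuracies)
    return [a / total for a in accuracies]
```
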
- the accuracy is calculated using the co-occurrence information, but the co-occurrence information is not necessarily used.
- the accuracy may be calculated only from the similarity, or the accuracy may be calculated from the similarity and the reliability.
- the evaluation value may be calculated from the similarity and the number of objects belonging to the cluster without using the accuracy, or the number of objects belonging to the cluster to which the object belongs may be used directly as the object importance without using the similarity. In short, it is sufficient that at least the number of objects belonging to the cluster to which the object belongs is used when evaluating the object importance.
- the evaluation value for an object cluster is calculated by multiplying the accuracy of the object with respect to the cluster and the number of objects belonging to the cluster.
- the present invention is not limited to this method.
- for example, only the evaluation value for the cluster to which the object belongs may be multiplied by 2, so that the cluster to which the object belongs is weighted more heavily than the other clusters.
- the evaluation value may be calculated from the accuracy and the number of objects belonging to the cluster.
- in Embodiments 1 to 3, the object importance of a certain object is evaluated using the evaluation values for all clusters, but it is not always necessary to use the evaluation values for all clusters. For example, an evaluation value may be calculated only for clusters whose similarity with the object is equal to or higher than a predetermined value, and only those evaluation values may be used. In short, it is sufficient that at least the number of objects belonging to the cluster to which the object belongs is used when evaluating the object importance.
- the method using the K-means method has been described as a method for classifying objects into clusters.
- the method is not limited to the K-means method as long as the objects can be classified into clusters.
- the cluster feature value 702 is automatically calculated by the K-means method, but the automatically calculated cluster feature value 702 need not necessarily be used; for example, the median of the feature quantities of the objects belonging to a cluster may be used.
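A minimal sketch of the K-means classification referred to above, using one-dimensional feature values for brevity (real object feature amounts 601 are vectors, but the assignment/update loop is the same); the iteration count and seed are illustrative.

```python
import random

def kmeans(features, k, iters=20, seed=0):
    """Assigns each feature value to the nearest cluster center, then moves
    each center to the mean of its members, and repeats."""
    rng = random.Random(seed)
    centers = rng.sample(features, k)
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for f in features:
            nearest = min(range(k), key=lambda c: abs(f - centers[c]))
            groups[nearest].append(f)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups
```

The alternative mentioned above, using the median of the member feature quantities as the cluster feature value 702, would simply replace the mean in the update step.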
- the co-occurrence information for all clusters is generated, but the present invention is not limited to this.
- a modification in which co-occurrence between the same clusters is not generated as co-occurrence information can be considered.
- if the co-occurrence degree 1101 of cluster A with respect to cluster B is known, the co-occurrence degree 1101 of cluster B with respect to cluster A can also be derived, so it is sufficient to generate only one of the two. For example, if a magnitude relationship is defined between cluster IDs, only the co-occurrence degree 1101 of the cluster with the smaller cluster ID with respect to the cluster with the larger cluster ID may be generated.
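The one-directional generation of co-occurrence degrees described above can be sketched by counting, for each unordered cluster pair, the images in which both clusters appear; keying each pair with the smaller cluster ID first corresponds to generating only one of the two directions. Counting per image and skipping same-cluster pairs are simplifying assumptions matching the modifications mentioned above.

```python
from itertools import combinations

def co_occurrence_counts(images):
    """images maps an image id to the cluster ids of the objects it contains.
    Returns, for each unordered pair of distinct clusters, the number of
    images in which objects of both clusters appear together."""
    counts = {}
    for clusters in images.values():
        for pair in combinations(sorted(set(clusters)), 2):
            counts[pair] = counts.get(pair, 0) + 1
    return counts
```
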
- the accuracy is calculated using the non-co-occurrence degree 1102 when only one object is included in one image; however, the accuracy may be calculated according to another standard. For example, the accuracy may be calculated using only the similarity, or using the similarity and the reliability.
- the importance of the co-occurrence object is not considered.
- for example, the importance of a co-occurrence object may be evaluated, and the image importance may be evaluated based also on the importance of the co-occurrence object.
- specifically, the number of co-occurrence objects belonging to the same object cluster as a co-occurrence object may be used as the importance of that co-occurrence object, and the image importance of an image may be obtained by further adding the importance of its co-occurrence objects to the sum of the object importances.
- in short, it suffices that at least the object importance is used for evaluating the image importance of the image.
- the object cluster classification information storage unit 2404 also stores the number of co-occurrence objects classified into each object cluster; however, as long as the number of objects belonging to each object cluster can be obtained, it is not always necessary to store it. For example, when the number of objects belonging to an object cluster is used, if the using side counts the objects belonging to the object cluster each time, the object cluster classification information storage unit 2404 does not need to store the number of objects.
- the method of classifying co-occurrence objects into object clusters has been described as SVM.
- the method is not limited to SVM as long as the co-occurrence objects can be classified into object clusters.
- the K-means method described as a method for classifying objects may be used.
- the accuracy with respect to the cluster of the object included in the image is set to 0; however, the accuracy does not necessarily have to be set to 0. For example, the accuracy of the object with respect to the cluster may be calculated based only on the similarity.
- the co-occurrence object detection process and the co-occurrence object classification process are added after the object classification process; however, they may be performed at any time after the images are acquired and before the co-occurrence information is generated. For example, the co-occurrence object detection process and the co-occurrence object classification process may be performed immediately before the co-occurrence information generation process.
- the squares of the differences between corresponding feature amount components of the cluster and the object are summed, and the square root of the sum (the Euclidean distance) is used as the difference between the cluster feature amount of the cluster and the object feature amount of the object; however, the present invention is not limited to this.
- for example, the arithmetic mean of the absolute values of the differences between the feature amount components of the cluster and the object may be used as the feature amount difference.
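The two feature-difference definitions mentioned above (the Euclidean distance, and the arithmetic mean of the absolute component differences) can be sketched as one function; the function and parameter names are illustrative.

```python
import math

def feature_difference(cluster_feature, object_feature, method="euclidean"):
    """Difference between a cluster feature amount and an object feature
    amount, both given as equal-length numeric vectors."""
    diffs = [c - o for c, o in zip(cluster_feature, object_feature)]
    if method == "euclidean":
        return math.sqrt(sum(d * d for d in diffs))
    # arithmetic mean of the absolute component differences
    return sum(abs(d) for d in diffs) / len(diffs)
```
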
- the sum of the differences between the cluster feature quantity of a cluster and the object feature quantity of each object belonging to the cluster is divided by the number of objects belonging to the cluster, and the reciprocal of that value is used as the reliability; however, the present invention is not limited to this.
- for example, the variance or the standard deviation may be calculated from the cluster feature amount of the cluster and the object feature amounts of the objects belonging to the cluster, and its reciprocal may be used as the reliability.
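The reliability computation described above, the reciprocal of the mean difference between the cluster feature amount and the member objects' feature amounts, can be sketched as follows; using the Euclidean distance as the difference and adding a small epsilon for the degenerate case of identical members are assumptions.

```python
import math

def cluster_reliability(cluster_feature, member_features, eps=1e-9):
    """Reciprocal of the average distance between the cluster feature amount
    and the object feature amounts of the cluster's members: the more
    concentrated the members, the higher the reliability."""
    distances = [math.sqrt(sum((c - o) ** 2
                               for c, o in zip(cluster_feature, m)))
                 for m in member_features]
    return 1.0 / (sum(distances) / len(distances) + eps)
```
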
- the method of extracting the feature amount using a Gabor filter has been described as an example, but the feature amount may be extracted by any method capable of extracting the feature amount of an image.
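As an illustrative sketch of Gabor-filter feature extraction of the kind referred to above: the kernel below is the real part of a Gabor filter, and one feature component per orientation is taken as the mean absolute filter response, reflecting the periodicity and directionality of the pixel-value distribution. Kernel size, wavelength, sigma, and the set of orientations are assumed values, not ones specified in the description.

```python
import math

def gabor_kernel(size=9, theta=0.0, wavelength=4.0, sigma=2.0):
    """Real part of a Gabor kernel: a cosine grating at orientation theta
    modulated by a Gaussian envelope."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)
            row.append(math.exp(-(x * x + y * y) / (2 * sigma * sigma))
                       * math.cos(2 * math.pi * xr / wavelength))
        kernel.append(row)
    return kernel

def gabor_features(image, thetas=(0.0, math.pi / 2)):
    """One feature component per orientation: the mean absolute 'valid'
    correlation of the image (a list of equal-length rows) with the kernel."""
    h, w = len(image), len(image[0])
    feats = []
    for t in thetas:
        k = gabor_kernel(theta=t)
        n = len(k)
        resps = [sum(image[i + a][j + b] * k[a][b]
                     for a in range(n) for b in range(n))
                 for i in range(h - n + 1) for j in range(w - n + 1)]
        feats.append(sum(abs(r) for r in resps) / len(resps))
    return feats
```

A pattern of vertical stripes with period matching the wavelength responds strongly at orientation 0 and weakly at pi/2, which is the directional selectivity the description relies on.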
- a control program made up of program code for causing the processor of the image management apparatus, and various circuits connected to the processor, to execute the image importance evaluation processing and the like shown in Embodiments 1 to 3 (see FIGS. 18 to 20, FIGS. 29 to 31, FIG. 34, and FIGS. 37 to 39) can be recorded on a recording medium, or can be circulated and distributed via various communication paths. Examples of such a recording medium include an IC card, a hard disk, an optical disc, a flexible disk, and a ROM.
- the circulated and distributed control program is used by being stored in a memory or the like readable by a processor, and the functions shown in the embodiments are realized by the processor executing the control program.
- a part of the control program may be transmitted, via various networks, to a device (processor) capable of executing programs that is separate from the image management apparatus, and that part of the control program may be executed in the separate device.
- part or all of the constituent elements constituting the image management apparatus may be implemented as one or a plurality of integrated circuits (IC, LSI, etc.), and it is also possible to form a single-chip integrated circuit by further adding other elements to the constituent elements of the image management apparatus.
- the image management apparatus includes: an image acquisition unit that acquires images; an object detection unit that detects objects in each image acquired by the image acquisition unit by extracting, based on a predetermined criterion, an object feature amount that is a feature amount related to the distribution of pixel values of a plurality of pixels corresponding to an object included in the image; object classification means for classifying each object detected in each image acquired by the image acquisition unit into one of a plurality of clusters according to the object feature amount of each object; object importance evaluation means for evaluating, for each object, an object importance that is the importance of the object, based on the number of objects belonging to the same cluster as the object; and image importance evaluation means for evaluating the importance of one image based on the object importances of the objects included in that image.
- with this image management apparatus, if the predetermined criterion is one for extracting the feature amount of a person's face, the object importance, which corresponds to the importance of the person's face included in an image, is calculated based on the number of appearances of objects belonging to the cluster indicating the same person as that face, and the importance of each image is calculated so as to reflect the importance of the objects included in the image.
- the object importance evaluation unit in the image management apparatus may evaluate the object importance of an object based on: an evaluation value for the cluster to which the object belongs, calculated based on the number of objects belonging to that cluster and on a similarity indicating how similar the object feature value of the object is to the cluster feature value that is the representative value of the feature values indicated by the cluster; and, for a cluster different from the cluster to which the object belongs, an evaluation value for that other cluster, calculated based on the number of objects belonging to the other cluster and on the similarity between the feature amount of the object and the cluster feature amount of the other cluster.
- according to this configuration, the probability (accuracy) that the object is the same person as the persons belonging to a cluster, including a cluster different from the one to which the object belongs, is calculated from the similarity, and the object importance is evaluated by weighting the number of objects belonging to each cluster with the accuracy obtained from the similarity; therefore, an object importance closer to the one obtained when the object is correctly determined to be the same person can be evaluated, and the image importance can be evaluated with higher accuracy.
- the object importance evaluation means in the image management apparatus may, when a second object is included in the image that includes a first object to be evaluated, calculate the evaluation value of the first object for a first cluster, which is the cluster to which the first object belongs or a cluster to which it does not belong, based on the co-occurrence degree of the first cluster and a second cluster, that is, the extent to which the event that an object belonging to the first cluster and an object belonging to the same second cluster as the second object are both included in one image occurs in the image group acquired by the image acquisition unit.
- according to this configuration, the possibility that objects represent the same person is calculated using not only the similarity of the feature amounts but also the co-occurrence relationship between persons, so the object importance can be evaluated appropriately even in cases where the similarity alone would not determine the person correctly.
- the object importance evaluation unit in the image management apparatus may, when the second object is included in the image including the first object to be evaluated, calculate the accuracy of the first object with respect to the first cluster based on: the certainty factor of the first object for the first cluster, obtained by calculating the co-occurrence degree of the first cluster and the second cluster to which the second object belongs as a ratio to the number of objects belonging to the second cluster; the support factor of the first object for the first cluster, obtained by calculating the co-occurrence degree of the first cluster and the second cluster as a ratio to the total number of objects detected by the object detection means; and the similarity of the first object to the first cluster; and may then calculate the evaluation value of the first object for the first cluster from the accuracy of the first object with respect to the first cluster and the number of objects belonging to the first cluster.
- according to this configuration, the accuracy of the first object with respect to the first cluster is calculated based on the certainty factor, the support factor, and the similarity of the first object with respect to the first cluster, so the evaluation value of the first object for the first cluster can be calculated based on the calculated accuracy.
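The certainty factor (confidence) and support factor described above can be sketched as simple ratios over the co-occurrence counts; the argument names are illustrative.

```python
def confidence_and_support(co_occurrence_count, second_cluster_size,
                           total_objects):
    """Certainty factor: co-occurrence degree of the first and second
    clusters as a ratio to the number of objects in the second cluster.
    Support factor: the same degree as a ratio to all detected objects."""
    return (co_occurrence_count / second_cluster_size,
            co_occurrence_count / total_objects)
```
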
- the object importance evaluation unit in the image management apparatus may, when the second object is included in the image including the first object to be evaluated, further calculate the accuracy based on: the reliability of the first cluster, calculated based on the differences between the cluster feature of the first cluster and the object features of the objects belonging to the first cluster, which indicates how strongly the object feature amounts are concentrated around the cluster feature amount; and the reliability of the second cluster, calculated based on the differences between the cluster feature amount of the second cluster and the object feature amounts of the objects belonging to the second cluster.
- according to this configuration, since the accuracy of the first object with respect to the first cluster is calculated based on the reliability of the first cluster and the reliability of the second cluster, that accuracy can be calculated with higher precision.
- the object importance evaluation means in the image management apparatus may, when the second object is included in the image including the first object to be evaluated, calculate the accuracy of the first object with respect to the first cluster using logistic regression with the certainty factor of the first object for the first cluster, the support factor of the first object for the first cluster, the similarity of the first object to the first cluster, the reliability of the first cluster, and the reliability of the second cluster as explanatory variables.
- the coefficient of each explanatory variable is determined based on the magnitude of the influence on the accuracy calculation based on past actual measurement values and the like. Therefore, according to this configuration, the accuracy of the first object with respect to the first cluster can be calculated with higher accuracy.
- the object importance evaluation means in the image management apparatus may, when the second object is included in the image including the first object to be evaluated, calculate the accuracy of the first object with respect to the first cluster using logistic regression with the certainty factor of the first object for the first cluster, the support factor of the first object for the first cluster, and the similarity of the first object to the first cluster as explanatory variables.
- the coefficient of each explanatory variable is determined based on the magnitude of the influence on the accuracy calculation based on past actual measurement values and the like. Therefore, according to this configuration, the accuracy of the first object with respect to the first cluster can be calculated with higher accuracy.
- the object importance evaluation unit in the image management apparatus may, when no object other than the object to be evaluated is included in the image, calculate the evaluation values for the cluster to which the object belongs and for clusters to which it does not belong further based on the non-co-occurrence degree of the cluster, that is, the extent to which the event that an object belonging to the cluster is included alone in one image occurs in the image group acquired by the image acquisition unit.
- according to this configuration, since the evaluation values for the cluster to which the object belongs and for the clusters to which it does not belong are calculated based on the non-co-occurrence degree, which is the extent to which objects of the cluster appear alone in an image, the possibility that the object belongs to the cluster can be calculated even when the object is included in an image alone.
- the object importance evaluation means in the image management apparatus may, when no object other than the object to be evaluated is included in the image, calculate the accuracy of the object with respect to the cluster to which it belongs or a cluster to which it does not belong based on: the certainty factor, calculated as the ratio of the non-co-occurrence degree of the cluster to the number of objects belonging to the cluster; the support factor, calculated as the ratio of the non-co-occurrence degree of the cluster to the total number of objects detected by the object detection means; and the similarity of the object to the cluster; and may calculate the evaluation value of the object for that cluster from the accuracy of the object with respect to the cluster and the number of objects belonging to the cluster.
- according to this configuration, since the accuracy of the object with respect to the cluster is calculated based on the non-co-occurrence certainty factor for the cluster, the non-co-occurrence support factor for the cluster, and the similarity of the object to the cluster, the possibility that the object belongs to the cluster can be calculated.
- the object detection means in the image management apparatus may extract the object feature amount based on a criterion for extracting a feature amount related to a human face. According to this configuration, since a human face, which strongly exhibits the characteristics of a person, is extracted as an object, the possibility that objects can be classified accurately increases, and images containing important persons can be ranked as images of high importance.
- the image management apparatus may further include: object detection means for detecting, in each image, an object by extracting, based on a predetermined criterion, an object feature amount that is a feature amount related to the distribution of pixel values of a plurality of pixels corresponding to the object included in the image; and object classification means for classifying each object detected in each image acquired by the image acquisition means into one of a plurality of object clusters according to the object feature amount of each object.
- the object importance evaluation means may further calculate the evaluation values for the cluster to which the object belongs and for clusters to which it does not belong based on the co-occurrence degree of the cluster and an object cluster, that is, the extent to which the event that an object belonging to the cluster and an object belonging to the object cluster to which one of the objects included in the image belongs are included in the same image occurs in the image group acquired by the image acquisition means.
- here, an object cluster is a unit of classification when objects are classified based on a predetermined standard, and each object cluster corresponds to a range of different object feature amounts.
- the object classification means in the image management apparatus may classify each object into clusters by the K-means method. According to this configuration, since the K-means method is used for classifying objects, each object can be classified into clusters by a simple algorithm.
- the object importance evaluation unit in the image management apparatus may calculate the object importance based on the reliability of the cluster to which the object belongs, which is calculated based on the differences between the cluster feature amount that is the representative value of the feature amounts indicated by the cluster and the object feature amounts of the objects belonging to the cluster, and which indicates how strongly the object feature amounts are concentrated around the cluster feature amount. According to this configuration, since the object importance is calculated based on the reliability of the cluster to which the object belongs and the number of objects belonging to the cluster, the object importance can be calculated with higher accuracy than when it is calculated based only on the number of objects belonging to the cluster.
- the image management apparatus and the image management method according to the present invention can be applied to storage devices for still images or moving images, playback devices for still images and moving images, photographing devices such as digital cameras, camera-equipped mobile phones and movie cameras, and PCs.
Description
<1. Overview>
FIG. 1 is a system configuration diagram showing an example of an image management system 10 constituted by an image management apparatus 100 according to an embodiment of the present invention and devices related to it. The image management apparatus 100 is connected to a photographing apparatus 110 and a display device 120. The image management apparatus 100 can also receive user operations via a controller 130.
<2. Embodiment 1>
Embodiment 1 describes an image management apparatus 100 that detects persons' faces as objects, evaluates the importance of each person as the object importance, and evaluates and ranks the image importance of each image based on the object importances.
<2-1. Configuration>
In terms of hardware, the image management apparatus 100 according to Embodiment 1 of the present invention includes a USB input terminal for inputting images, an HDMI output terminal for outputting video, a memory for storing data and programs, and a processor for executing the programs.
<2-2. Data>
Next, the information handled by the image management apparatus 100 having the above configuration will be described.
<2-2-1. Image group>
The image group 300 is the set of images that the image management apparatus 100 ranks. The image group 300 is generated by the image acquisition unit 201 from the images input from the photographing apparatus 110, stored in the image storage unit 202, and used by the object detection unit 203 and the image output unit 217.
<2-2-2. Object appearance information>
The object appearance information 500 indicates which objects were detected in which images. The object appearance information 500 is generated by the object detection unit 203, stored in the object appearance information storage unit 205, and used by the co-occurrence information generation unit 210, the accuracy calculation unit 212, and the image importance evaluation unit 215.
<2-2-3. Feature amount>
A feature amount indicates characteristics of the distribution of pixel values of a plurality of pixels in an image. For example, a feature amount is a vector whose components are a plurality of numerical values indicating features of the image. Image features include the periodicity and directionality of the distribution of pixel values of image data obtained using a Gabor filter; in the case of a feature amount related to a person's face, quantities such as the distance between two points recognized as eyes from the periodicity and directionality of the pixel-value distribution, or the distance between a point recognized as a nose and a point recognized as a mouth, can be used as components.
<2-2-4. Cluster>
As data related to clusters, the cluster ID 703, the cluster feature amount 702, and the cluster classification information 900 will be described.
<2-2-5. Co-occurrence information>
Here, co-occurrence and non-co-occurrence are described first, followed by the co-occurrence information 1100.
<2-2-6. Similarity>
The similarity 1201 is a value indicating how close the object feature amount 601 of an object and the cluster feature amount 702 of a cluster are. The similarity 1201 is generated by the similarity calculation unit 211 and used by the accuracy calculation unit 212.
<2-2-7. Accuracy>
The accuracy 1301 is a value indicating the strength of association between an object and a cluster, using not only the similarity 1201 but also the co-occurrence information 1100. The accuracy 1301 is generated by the accuracy calculation unit 212 and used by the evaluation value calculation unit 213. By using the accuracy 1301, the strength of association can be evaluated more precisely than by judging it from the similarity 1201 alone.
<2-2-8.評価値>
評価値1401は、オブジェクトとクラスタとの組み合わせに対して算出される重要度であり、後述するオブジェクト重要度1501は評価値1401に基づいて評価される。評価値1401は、評価値算出部213により生成され、オブジェクト重要度評価部214により使用される。
<2-2-8.オブジェクト重要度>
オブジェクト重要度1501は、オブジェクトごとに評価される重要度であり、後述する画像重要度1601はオブジェクトのオブジェクト重要度1501に基づいて評価される。オブジェクト重要度1501は、オブジェクト重要度評価部214により生成され、画像重要度評価部215に使用される。
<2-2-9.画像重要度>
画像重要度1601は、画像ごとに評価される重要度であり、画像管理装置100は画像群300を各画像の画像重要度1601に基づいてランキングする。画像重要度1601は、画像重要度評価部215により生成され、画像ランキング部216に使用される。
<2-3. Operation>
Next, the operation of the image management apparatus 100 according to the present invention is described.
<2-3-1. Overview of the operation>
An overview of the operations performed by the image management apparatus 100 is described with reference to the flowchart in FIG. 18.
<2-3-2. Object detection process>
The object detection process (S1802) performed by the object detection unit 203 is described here.
<2-3-3. Object classification process>
The object classification process (S1803) performed by the object classification unit 207 is described here.
<2-3-4. Co-occurrence information generation process>
The process (S1807) in which the co-occurrence information generation unit 210 generates the co-occurrence information 1100 for the image group 300 is described here.
<2-3-5. Evaluation value calculation process>
The process (S1808) in which the evaluation value calculation unit 213 calculates the evaluation values 1401 is described here.
<2-3-6. Accuracy calculation process>
The process in which the accuracy calculation unit 212 calculates the accuracies 1301 is described here.
<2-4. Effects of Embodiment 1>
The image management apparatus 100 according to Embodiment 1 obtains the object importance, which corresponds to the importance of a person's face (the object included in an image), based on the number of appearances of objects belonging to the cluster indicating the same person as that face. It then obtains the importance of each image based on those object importances, and ranks the images by image importance, so that images containing frequently appearing persons are placed near the top. Since a user can be expected to own many images of the persons they are most interested in, by browsing the images from the top of the ranking the user can easily select, from an enormous number of images, the images showing important persons of high interest, that is, the images of high importance.
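The core of this effect, counting how often each person (cluster) appears across the image group and using that count as the object importance, can be sketched as follows with hypothetical object and cluster IDs:

```python
from collections import Counter

# Maps each detected object ID to its cluster (person) ID.
cluster_of = {"O1": "P1", "O2": "P2", "O3": "P1",
              "O4": "P1", "O5": "P2", "O6": "P3"}

# Object importance based on how many objects share the object's
# cluster, i.e. how often the same person appears in the image group.
cluster_sizes = Counter(cluster_of.values())
object_importance = {obj: cluster_sizes[c] for obj, c in cluster_of.items()}

print(object_importance["O1"])  # 3: person P1 appears three times
print(object_importance["O6"])  # 1: person P3 appears once
```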
<3. Embodiment 2>
As Embodiment 2 of the present invention, a modified image management apparatus 2300 is described in which the accuracy 1301, which in Embodiment 1 was calculated using co-occurrence relationships between the clusters to which human-face objects belong, is instead calculated using co-occurrence relationships between human faces and non-human physical objects.
<3-1. Configuration>
The hardware configuration of the modified image management apparatus 2300 is the same as that of the image management apparatus 100 of Embodiment 1.
<3-2. Data>
<3-2-1. Physical-object appearance information>
The physical-object appearance information 2600 indicates which co-occurring physical objects were detected in which images. The physical-object appearance information 2600 is generated by the physical-object detection unit 2401, stored in the physical-object appearance information storage unit 2402, and used by the co-occurrence information generation unit 210a and the accuracy calculation unit 212a.
<3-2-2. Physical-object feature amounts>
A physical-object feature amount is a feature amount relating to a physical object. For example, it is a vector whose components are a plurality of numerical values representing characteristics of the image. For a feature amount relating to an automobile, if pixels whose pixel values are recognized as a wheel (for example, pixel values representing black) are arranged along a circle, quantities such as the diameter of that circle and the position of its center can be used as feature amounts of the automobile's wheels. A vector containing the wheel feature amounts, window feature amounts, and so on as components can then be used as the physical-object feature amount of the automobile.
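The composition described here, building a car-level feature vector out of part-level measurements, can be sketched as follows. All the numbers are hypothetical measurements for one detected car; real values would come from analyzing the pixel distribution.

```python
# Part-level features for a hypothetical detected car.
wheel_features = (18.0, 35.0, 80.0)   # circle diameter, centre x, centre y
window_features = (40.0, 12.0)        # hypothetical width and height

def physical_object_feature(*part_features):
    """Concatenate part-level feature tuples into one physical-object
    feature vector, as the text describes for wheels and windows."""
    return tuple(v for part in part_features for v in part)

car_feature = physical_object_feature(wheel_features, window_features)
print(car_feature)       # (18.0, 35.0, 80.0, 40.0, 12.0)
print(len(car_feature))  # 5
```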
<3-2-3. Physical-object clusters>
As data relating to physical-object clusters, the physical-object cluster ID 2701 and the physical-object cluster classification information 2700 are described.
<3-2-4. Co-occurrence information>
The co-occurrence information 2800 in Embodiment 2 indicates relationships between clusters and physical-object clusters.
<3-3. Operation>
FIG. 29 is a flowchart showing the operation of the modified image management apparatus 2300. Portions of the operation that are the same as in the image management apparatus 100 are assigned the same reference signs as in FIG. 18.
<3-3-1. Co-occurring physical-object detection process>
The co-occurring physical-object detection process (S2901) performed by the physical-object detection unit 2401 is described here.
<3-3-2. Co-occurring physical-object classification process>
The co-occurring physical-object classification process (S2602) performed by the physical-object classification unit 2403 is described here.
<3-3-3. Co-occurrence information generation process>
The process (S1807a) in which the co-occurrence information generation unit 210a generates the co-occurrence information 2800 for the image group 300 is described here.
<3-3-4. Evaluation value calculation process>
The process (S1808a) in which the evaluation value calculation unit 213 calculates the evaluation values 1401 is described here.
<3-3-5. Accuracy calculation process>
The process in which the accuracy calculation unit 212a calculates the accuracies 1301 is described here.
<3-4. Effects of Embodiment 2>
Like the image management apparatus 100, the modified image management apparatus 2300 according to Embodiment 2 allows the user to easily select, from an enormous number of images, the images showing important persons of high interest.
<3-5. Modification (combination of Embodiments 1 and 2)>
As a modification of Embodiment 2, an image management apparatus is described in which the calculation of the accuracy 1301 using the co-occurrence relationships of clusters with physical-object clusters is combined with the calculation of the accuracy 1301 using the co-occurrence relationships between clusters performed in Embodiment 1.
<4. Embodiment 3>
As Embodiment 3 of the present invention, a modified image management apparatus 3500 is described in which the accuracy 1301, which in Embodiment 1 was calculated based on the similarity, the confidence, and the support, is further calculated based also on the reliability of each cluster. Here, the reliability of a cluster indicates how concentrated the object feature amounts of the objects belonging to the cluster are around the cluster feature amount of that cluster, in other words, how small the overall deviation of the object feature amounts of the objects belonging to the cluster is. For example, for cluster C001, as shown in FIG. 7, the distances between the cluster feature amount 702a of cluster C001 and the object feature amounts 601a, 603c, and 601f of the objects belonging to cluster C001 represent the magnitudes of the feature-amount differences. That is, the smaller the distances between the cluster feature amount and the object feature amounts of the objects belonging to the cluster, the more concentrated the object feature amounts are around the cluster feature amount, and the higher the reliability of the cluster.
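One way to turn this notion of concentration into a number, shown as an illustration only: reliability falls as the mean distance of the member feature amounts from the cluster feature amount grows. The 1 / (1 + mean distance) form is an assumption of this sketch; the text only requires that a smaller overall deviation yield a higher reliability.

```python
import math

def cluster_reliability(cluster_feature, member_features):
    """Reliability of a cluster: high when the member object feature
    amounts are tightly concentrated around the cluster feature
    amount, low when they are widely scattered."""
    mean_dist = sum(math.dist(cluster_feature, f)
                    for f in member_features) / len(member_features)
    return 1.0 / (1.0 + mean_dist)

# A tight cluster versus a scattered one (hypothetical 2-D features).
tight = cluster_reliability((0.0, 0.0), [(0.1, 0.0), (0.0, 0.1), (-0.1, 0.0)])
loose = cluster_reliability((0.0, 0.0), [(3.0, 0.0), (0.0, 4.0), (-5.0, 0.0)])
print(tight > loose)  # True
```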
<4-1. Configuration>
The hardware configuration of the modified image management apparatus 3500 is the same as that of the image management apparatus 100 of Embodiment 1.
<4-2. Data>
<4-2-1. Reliability information>
The reliability information 3600 indicates the reliability 3601 of each cluster. It is generated and updated by the reliability calculation unit 3501 and used by the accuracy calculation unit 212b.
<4-3. Operation>
FIG. 37 is a flowchart showing the operation of the modified image management apparatus 3500. Portions of the operation that are the same as in the image management apparatus 100 are assigned the same reference signs as in FIG. 18.
<4-3-1. Reliability calculation process>
The process of calculating the reliability 3601 is described below.
<4-3-2. Accuracy calculation process>
The process in which the accuracy calculation unit 212b calculates the accuracies 1301 is described here.
<5. Supplement>
The image management apparatus according to the present invention has been described above based on the embodiments, but the present invention is of course not limited to the image management apparatus exactly as described in the above embodiments.
(16) In Embodiments 1 to 3, logistic regression analysis is used to calculate the accuracy 1301, but the calculation need not be limited to logistic regression analysis. The accuracy may be calculated by another method using the similarity and the co-occurrence information, or using the similarity, the co-occurrence information, and the reliability. Also, although the sum of the accuracies 1301 of one object over all clusters is not necessarily 1, the accuracies 1301 may be normalized so that the sum becomes 1.
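The normalization suggested in this modification can be sketched as plain proportional scaling of one object's accuracies over all clusters (cluster IDs and values are hypothetical); this is one of several possible normalizations.

```python
def normalize_accuracies(accuracies):
    """Normalize one object's accuracies over all clusters so that
    they sum to 1, as suggested in modification (16)."""
    total = sum(accuracies.values())
    return {cluster: a / total for cluster, a in accuracies.items()}

# Hypothetical accuracies of one object with respect to three clusters.
acc = {"C001": 0.8, "C002": 0.4, "C003": 0.4}
norm = normalize_accuracies(acc)
print(norm["C001"])        # 0.5
print(sum(norm.values()))  # 1.0
```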
...are provided. With this configuration, if the predetermined criterion defines feature amounts of human faces, the image management apparatus obtains the object importance, which corresponds to the importance of a person's face (the object included in an image), based on the number of appearances of objects belonging to the cluster indicating the same person as that face, obtains the importance of each image so as to reflect the importances of the objects the image contains, and ranks the images by image importance, so that images containing frequently appearing persons are placed near the top. By browsing the images from the top of the ranking, the user can easily select, from an enormous number of images, the images showing important persons of high interest, that is, the images of high importance.
100 image management apparatus
110 image capturing device
120 display device
130 controller
201 image acquisition unit
202 image storage unit
203 object detection unit
204 template storage unit
205 object appearance information storage unit
206 object feature amount storage unit
207 object classification unit
208 cluster feature amount storage unit
209 cluster classification information storage unit
210, 210a co-occurrence information generation unit
211 similarity calculation unit
212, 212a, 212b accuracy calculation unit
213 evaluation value calculation unit
214 object importance evaluation unit
215 image importance evaluation unit
216 image ranking unit
217 image output unit
218 operation input unit
700 feature amount space
701 cluster
702 cluster feature amount
703 cluster ID
704 cluster boundary
2300 modified image management apparatus
2301 object unit
2401 physical-object detection unit
2402 physical-object appearance information storage unit
2403 physical-object classification unit
2404 physical-object cluster classification information storage unit
3500 modified image management apparatus
3501 reliability calculation unit
Claims (17)
- An image management apparatus comprising:
image acquisition means for acquiring images;
object detection means for detecting an object in each image acquired by the image acquisition means by extracting, based on a predetermined criterion, an object feature amount, which is a feature amount relating to the distribution of pixel values of a plurality of pixels corresponding to the object included in the image;
object classification means for classifying each object detected in each image acquired by the image acquisition means into one of a plurality of clusters according to the object feature amount of each object;
object importance evaluation means for evaluating, for each object, an object importance, which is the importance of the object, based on the number of objects belonging to the same cluster as the object; and
image importance evaluation means for evaluating the importance of one image based on the object importances of the objects included in that image.
- The image management apparatus according to claim 1, wherein the object importance evaluation means evaluates the object importance of an object based on:
an evaluation value of the object with respect to the cluster to which the object belongs, calculated from the number of objects belonging to the same cluster as the object and a similarity indicating how similar the object feature amount of the object is to the cluster feature amount, which is a representative value of the feature amounts indicated by that cluster; and
for a cluster other than the cluster to which the object belongs, an evaluation value of the object with respect to that other cluster, calculated from the number of objects belonging to the other cluster and the similarity between the feature amount of the object and the cluster feature amount of the other cluster.
- The image management apparatus according to claim 2, wherein, when a second object is included together in an image containing a first object being evaluated, the object importance evaluation means calculates the evaluation value of the first object with respect to a first cluster, which is a cluster to which the first object belongs or a cluster to which the first object does not belong, further based on the degree of co-occurrence between the first cluster and a second cluster, the degree of co-occurrence being the extent to which, within the image group acquired by the image acquisition means, the event occurs that an object belonging to the first cluster and an object belonging to the second cluster, which is the cluster to which the second object belongs, are both included in a single image.
- The image management apparatus according to claim 3, wherein, when a second object is included together in an image containing a first object being evaluated, the object importance evaluation means:
calculates an accuracy of the first object with respect to the first cluster, computed based on
a confidence of the first object with respect to the first cluster, calculated as the ratio of the degree of co-occurrence between the first cluster and the second cluster, which is the cluster to which the second object belongs, to the number of objects belonging to the second cluster,
a support of the first object with respect to the first cluster, calculated as the ratio of the degree of co-occurrence between the first cluster and the second cluster to the total number of objects detected by the object detection means, and
the similarity of the first object with respect to the first cluster; and
calculates the evaluation value of the first object with respect to the first cluster from the accuracy of the first object with respect to the first cluster and the number of objects belonging to the first cluster.
- The image management apparatus according to claim 4, wherein, when a second object is included together in an image containing a first object being evaluated, the object importance evaluation means calculates the accuracy further based on a reliability of the first cluster, which is calculated based on the differences between the cluster feature amount of the first cluster and the object feature amounts of the objects belonging to the first cluster and indicates how concentrated the object feature amounts are around the cluster feature amount, and a reliability of the second cluster, which is calculated based on the differences between the cluster feature amount of the second cluster and the object feature amounts of the objects belonging to the second cluster.
- The image management apparatus according to claim 5, wherein, when a second object is included together in an image containing a first object being evaluated, the object importance evaluation means computes the accuracy of the first object with respect to the first cluster using logistic regression with
the confidence of the first object with respect to the first cluster,
the support of the first object with respect to the first cluster,
the similarity of the first object with respect to the first cluster,
the reliability of the first cluster, and
the reliability of the second cluster
as explanatory variables.
- The image management apparatus according to claim 4, wherein, when a second object is included together in an image containing a first object being evaluated, the object importance evaluation means computes the accuracy of the first object with respect to the first cluster using logistic regression with
the confidence of the first object with respect to the first cluster,
the support of the first object with respect to the first cluster, and
the similarity of the first object with respect to the first cluster
as explanatory variables.
- The image management apparatus according to claim 2, wherein, when an image containing an object being evaluated includes no object other than that object, the object importance evaluation means calculates the evaluation values of the object being evaluated with respect to the cluster to which the object belongs and the clusters to which the object does not belong, further based on the degree of non-co-occurrence of each such cluster, which is the extent to which, within the image group acquired by the image acquisition means, the event occurs that an object belonging to the cluster is included alone in a single image.
- The image management apparatus according to claim 8, wherein, when an image containing an object being evaluated includes no object other than that object, the object importance evaluation means:
calculates an accuracy of the object with respect to a cluster to which the object belongs or a cluster to which the object does not belong, computed based on
a confidence calculated as the ratio of the degree of non-co-occurrence of that cluster to the number of objects belonging to the cluster,
a support calculated as the ratio of the degree of non-co-occurrence of the cluster to the total number of objects detected by the object detection means, and
the similarity of the object with respect to the cluster; and
calculates the evaluation values of the object being evaluated with respect to the cluster to which it belongs and the clusters to which it does not belong from the accuracy of the object with respect to each cluster and the number of objects belonging to that cluster.
- The image management apparatus according to claim 2, wherein the object detection means extracts object feature amounts according to a criterion for extracting feature amounts relating to human faces.
- The image management apparatus according to claim 10, further comprising:
physical-object detection means for detecting a physical object in each image by extracting, based on a predetermined criterion, a physical-object feature amount, which is a feature amount relating to the distribution of pixel values of a plurality of pixels corresponding to the physical object included in the image; and
physical-object classification means for classifying each physical object detected in each image acquired by the image acquisition means into one of a plurality of physical-object clusters according to the physical-object feature amount of each physical object,
wherein, when one or more physical objects are included together in an image containing an object being evaluated, the object importance evaluation means calculates the evaluation values of the object being evaluated with respect to the cluster to which the object belongs and the clusters to which the object does not belong, further based on the degree of co-occurrence between the cluster and a physical-object cluster, which is the extent to which, within the image group acquired by the image acquisition means, the event occurs that an object belonging to the cluster and a physical object belonging to the physical-object cluster to which one of the physical objects included together with the object belongs are both included in a single image.
- The image management apparatus according to claim 1, wherein the object classification means classifies the objects into clusters using the K-means method.
- The image management apparatus according to claim 1, wherein the object importance evaluation means calculates the object importance further based on a cluster reliability, which is calculated based on the differences between the cluster feature amount, a representative value of the feature amounts indicated by the cluster to which the object belongs, and the object feature amounts of the objects belonging to that cluster, and indicates how concentrated the object feature amounts are around the cluster feature amount.
- An image management method comprising:
an image acquisition step of acquiring images;
an object detection step of detecting an object in each image acquired in the image acquisition step by extracting, based on a predetermined criterion, an object feature amount, which is a feature amount relating to the distribution of pixel values of a plurality of pixels corresponding to the object included in the image;
an object classification step of classifying each object detected in each image acquired in the image acquisition step into one of a plurality of clusters according to the object feature amount of each object;
an object importance evaluation step of evaluating, for each object, an object importance, which is the importance of the object, based on the number of objects belonging to the same cluster as the object; and
an image importance evaluation step of evaluating the importance of one image based on the object importances of the objects included in that image.
- A program causing a computer to execute processing comprising:
an image acquisition step of acquiring images;
an object detection step of detecting an object in each image acquired in the image acquisition step by extracting, based on a predetermined criterion, an object feature amount, which is a feature amount relating to the distribution of pixel values of a plurality of pixels corresponding to the object included in the image;
an object classification step of classifying each object detected in each image acquired in the image acquisition step into one of a plurality of clusters according to the object feature amount of each object;
an object importance evaluation step of evaluating, for each object, an object importance, which is the importance of the object, based on the number of objects belonging to the same cluster as the object; and
an image importance evaluation step of evaluating the importance of one image based on the object importances of the objects included in that image.
- A recording medium recording a program causing a computer to execute processing comprising:
an image acquisition step of acquiring images;
an object detection step of detecting an object in each image acquired in the image acquisition step by extracting, based on a predetermined criterion, an object feature amount, which is a feature amount relating to the distribution of pixel values of a plurality of pixels corresponding to the object included in the image;
an object classification step of classifying each object detected in each image acquired in the image acquisition step into one of a plurality of clusters according to the object feature amount of each object;
an object importance evaluation step of evaluating, for each object, an object importance, which is the importance of the object, based on the number of objects belonging to the same cluster as the object; and
an image importance evaluation step of evaluating the importance of one image based on the object importances of the objects included in that image.
- An integrated circuit comprising:
image acquisition means for acquiring images;
object detection means for detecting an object in each image acquired by the image acquisition means by extracting, based on a predetermined criterion, an object feature amount, which is a feature amount relating to the distribution of pixel values of a plurality of pixels corresponding to the object included in the image;
object classification means for classifying each object detected in each image acquired by the image acquisition means into one of a plurality of clusters according to the object feature amount of each object;
object importance evaluation means for evaluating, for each object, an object importance, which is the importance of the object, based on the number of objects belonging to the same cluster as the object; and
image importance evaluation means for evaluating the importance of one image based on the object importances of the objects included in that image.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP11734479.6A EP2528034B1 (en) | 2010-01-22 | 2011-01-13 | Image management device, image management method, program, recording medium, and integrated circuit |
CN201180001520.9A CN102792332B (zh) | 2010-01-22 | 2011-01-13 | 图像管理装置、图像管理方法及集成电路 |
US13/256,505 US20120002881A1 (en) | 2010-01-22 | 2011-01-13 | Image management device, image management method, program, recording medium, and integrated circuit |
JP2011535339A JP5330530B2 (ja) | 2010-01-22 | 2011-01-13 | 画像管理装置、画像管理方法、プログラム、記録媒体及び集積回路 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-012468 | 2010-01-22 | ||
JP2010012468 | 2010-01-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011089872A1 true WO2011089872A1 (ja) | 2011-07-28 |
Family
ID=44306674
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/000150 WO2011089872A1 (ja) | 2010-01-22 | 2011-01-13 | 画像管理装置、画像管理方法、プログラム、記録媒体及び集積回路 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20120002881A1 (ja) |
EP (1) | EP2528034B1 (ja) |
JP (1) | JP5330530B2 (ja) |
CN (1) | CN102792332B (ja) |
WO (1) | WO2011089872A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014132613A1 (ja) * | 2013-02-26 | 2014-09-04 | 日本電気株式会社 | 診断支援システム、診断支援方法及びそのプログラム |
WO2018042606A1 (ja) * | 2016-09-01 | 2018-03-08 | 株式会社日立製作所 | 分析装置、分析システムおよび分析方法 |
JPWO2017069231A1 (ja) * | 2015-10-23 | 2018-10-04 | 国立大学法人大阪大学 | 人体における治療後の形態予測方法及びシステム |
CN116403080A (zh) * | 2023-06-09 | 2023-07-07 | 江西云眼视界科技股份有限公司 | 一种人脸聚类评价方法、系统、计算机及可读存储介质 |
CN117786434A (zh) * | 2023-11-22 | 2024-03-29 | 太极计算机股份有限公司 | 一种集群管理方法 |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2002102820A1 (en) | 2001-06-20 | 2002-12-27 | Nuevolution A/S | Nucleoside derivatives for library preparation |
ATE414769T1 (de) * | 2002-03-15 | 2008-12-15 | Nuevolution As | Eine verbesserte methode zur synthese von matritzenabhängigen molekülen |
AU2003247266A1 (en) | 2002-08-01 | 2004-02-23 | Nuevolution A/S | Multi-step synthesis of templated molecules |
US9049419B2 (en) | 2009-06-24 | 2015-06-02 | Hewlett-Packard Development Company, L.P. | Image album creation |
WO2011127933A1 (en) | 2010-04-16 | 2011-10-20 | Nuevolution A/S | Bi-functional complexes and methods for making and using such complexes |
US8923629B2 (en) * | 2011-04-27 | 2014-12-30 | Hewlett-Packard Development Company, L.P. | System and method for determining co-occurrence groups of images |
US8958645B2 (en) * | 2012-04-19 | 2015-02-17 | Canon Kabushiki Kaisha | Systems and methods for topic-specific video presentation |
WO2014068567A1 (en) * | 2012-11-02 | 2014-05-08 | Itzhak Wilf | Method and system for predicting personality traits, capabilities and suggested interactions from images of a person |
US20140169687A1 (en) * | 2012-12-13 | 2014-06-19 | Htc Corporation | Image search systems and methods |
JP6213557B2 (ja) * | 2013-03-01 | 2017-10-18 | 日本電気株式会社 | 情報処理装置、そのデータ処理方法、およびプログラム |
US9369662B2 (en) * | 2013-04-25 | 2016-06-14 | Microsoft Technology Licensing, Llc | Smart gallery and automatic music video creation from a set of photos |
JP6009481B2 (ja) | 2014-03-11 | 2016-10-19 | 富士フイルム株式会社 | 画像処理装置、重要人物判定方法、画像レイアウト方法ならびにプログラムおよび記録媒体 |
US10002310B2 (en) * | 2014-04-29 | 2018-06-19 | At&T Intellectual Property I, L.P. | Method and apparatus for organizing media content |
JP6660119B2 (ja) | 2015-08-07 | 2020-03-04 | キヤノン株式会社 | 情報処理装置、情報処理方法、並びにプログラム |
CN107924465B (zh) * | 2016-03-18 | 2021-09-10 | Jvc 建伍株式会社 | 物体识别装置、物体识别方法以及存储介质 |
JP6798183B2 (ja) * | 2016-08-04 | 2020-12-09 | 株式会社リコー | 画像解析装置、画像解析方法およびプログラム |
GB2553775A (en) * | 2016-09-09 | 2018-03-21 | Snell Advanced Media Ltd | Method and apparatus for ordering images |
US10909341B2 (en) * | 2016-10-26 | 2021-02-02 | Datalogic Automation, Inc. | Data processing reduction in barcode reading systems with overlapping frames |
US11205089B2 (en) * | 2017-07-07 | 2021-12-21 | Nec Corporation | Object identification device, object identification method, and recording medium |
CN107832852B (zh) * | 2017-11-14 | 2021-03-02 | 深圳码隆科技有限公司 | 数据处理学习方法、系统以及电子设备 |
CN108596735A (zh) * | 2018-04-28 | 2018-09-28 | 北京旷视科技有限公司 | 信息推送方法、装置及系统 |
CN110826616B (zh) * | 2019-10-31 | 2023-06-30 | Oppo广东移动通信有限公司 | 信息处理方法及装置、电子设备、存储介质 |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002024229A (ja) * | 2000-07-03 | 2002-01-25 | Fuji Photo Film Co Ltd | 本人画像提供システム |
JP2002215643A (ja) * | 2001-01-15 | 2002-08-02 | Minolta Co Ltd | 画像分類プログラム、画像分類プログラムを記録したコンピュータ読み取り可能な記録媒体、画像分類方法および画像分類装置 |
JP2004046591A (ja) | 2002-07-12 | 2004-02-12 | Konica Minolta Holdings Inc | 画像評価装置 |
JP2004318603A (ja) * | 2003-04-17 | 2004-11-11 | Nippon Telegr & Teleph Corp <Ntt> | 画像検索方法、自分撮り推定装置および方法、並びに自分撮り推定プログラム |
JP2005020446A (ja) | 2003-06-26 | 2005-01-20 | Casio Comput Co Ltd | 画像撮影装置及びプログラム |
JP2005056387A (ja) * | 2003-07-18 | 2005-03-03 | Canon Inc | 画像処理装置、撮像装置、画像処理方法 |
JP2005107885A (ja) * | 2003-09-30 | 2005-04-21 | Casio Comput Co Ltd | 画像分類装置及び画像分類プログラム |
JP2005148900A (ja) * | 2003-11-12 | 2005-06-09 | Nippon Telegr & Teleph Corp <Ntt> | 画像分類装置、画像分類方法、および、プログラム |
JP2006523334A (ja) * | 2003-02-06 | 2006-10-12 | センターフレーム・リミテッド・ライアビリティ・カンパニー | 特定の電子画像をユーザに配布する方法 |
JP2006318034A (ja) * | 2005-05-10 | 2006-11-24 | Fuji Photo Film Co Ltd | 画像選択方法、画像選択装置、プログラム、およびプリント注文受付機 |
JP2007174378A (ja) * | 2005-12-22 | 2007-07-05 | Fujifilm Corp | 画像ファイリング方法及びデジタルカメラ及び画像ファイリング処理プログラム及び動画記録再生装置 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7382903B2 (en) * | 2003-11-19 | 2008-06-03 | Eastman Kodak Company | Method for selecting an emphasis image from an image collection based upon content recognition |
WO2006080755A1 (en) * | 2004-10-12 | 2006-08-03 | Samsung Electronics Co., Ltd. | Method, medium, and apparatus for person-based photo clustering in digital photo album, and person-based digital photo albuming method, medium, and apparatus |
US8050453B2 (en) * | 2006-06-15 | 2011-11-01 | Omron Corporation | Robust object tracking system |
US8031914B2 (en) * | 2006-10-11 | 2011-10-04 | Hewlett-Packard Development Company, L.P. | Face-based image clustering |
US7903883B2 (en) * | 2007-03-30 | 2011-03-08 | Microsoft Corporation | Local bi-gram model for object recognition |
US7953690B2 (en) * | 2008-01-25 | 2011-05-31 | Eastman Kodak Company | Discovering social relationships from personal photo collections |
2011 application events:
- 2011-01-13: JP application JP2011535339 granted as patent JP5330530B2 (active)
- 2011-01-13: EP application EP11734479.6 granted as patent EP2528034B1 (active)
- 2011-01-13: US application US13/256,505 published as US20120002881A1 (abandoned)
- 2011-01-13: CN application CN201180001520.9 granted as patent CN102792332B (active)
- 2011-01-13: PCT application PCT/JP2011/000150 filed as WO2011089872A1
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014132613A1 (ja) * | 2013-02-26 | 2014-09-04 | 日本電気株式会社 | 診断支援システム、診断支援方法及びそのプログラム |
US10127473B2 (en) | 2013-02-26 | 2018-11-13 | Nec Corporation | Diagnosis assisting system, diagnosis assisting method, and program thereof |
JPWO2017069231A1 (ja) * | 2015-10-23 | 2018-10-04 | 国立大学法人大阪大学 | 人体における治療後の形態予測方法及びシステム |
US11065084B2 (en) | 2015-10-23 | 2021-07-20 | Osaka University | Method and system for predicting shape of human body after treatment |
US11617633B2 (en) | 2015-10-23 | 2023-04-04 | Osaka University | Method and system for predicting shape of human body after treatment |
WO2018042606A1 (ja) * | 2016-09-01 | 2018-03-08 | 株式会社日立製作所 | 分析装置、分析システムおよび分析方法 |
CN116403080A (zh) * | 2023-06-09 | 2023-07-07 | 江西云眼视界科技股份有限公司 | 一种人脸聚类评价方法、系统、计算机及可读存储介质 |
CN116403080B (zh) * | 2023-06-09 | 2023-08-11 | 江西云眼视界科技股份有限公司 | 一种人脸聚类评价方法、系统、计算机及可读存储介质 |
CN117786434A (zh) * | 2023-11-22 | 2024-03-29 | 太极计算机股份有限公司 | 一种集群管理方法 |
Also Published As
Publication number | Publication date |
---|---|
JP5330530B2 (ja) | 2013-10-30 |
EP2528034B1 (en) | 2019-03-06 |
EP2528034A1 (en) | 2012-11-28 |
JPWO2011089872A1 (ja) | 2013-05-23 |
CN102792332B (zh) | 2016-01-06 |
CN102792332A (zh) | 2012-11-21 |
EP2528034A4 (en) | 2017-04-26 |
US20120002881A1 (en) | 2012-01-05 |
Legal Events
Code | Title | Description
---|---|---
WWE | Wipo information: entry into national phase | Ref document number: 201180001520.9; Country of ref document: CN
WWE | Wipo information: entry into national phase | Ref document number: 2011535339; Country of ref document: JP
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11734479; Country of ref document: EP; Kind code of ref document: A1
WWE | Wipo information: entry into national phase | Ref document number: 13256505; Country of ref document: US
WWE | Wipo information: entry into national phase | Ref document number: 2011734479; Country of ref document: EP
NENP | Non-entry into the national phase | Ref country code: DE