WO2011099108A1 - 画像評価装置、画像評価方法、プログラム、集積回路 - Google Patents
- Publication number
- WO2011099108A1 (PCT/JP2010/007300; JP2010007300W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- value
- importance
- calculating
- indicating
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/98—Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
- G06V10/993—Evaluation of the quality of the acquired pattern
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/30—Scenes; Scene-specific elements in albums, collections or shared content, e.g. social network photos or video
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
Definitions
- in order to assist a user in easily finding an image that he/she wants to see among such a large number of images, a method for evaluating and ranking individual images is known.
- however, the image that the user expects to be highly evaluated is often not an image in which the photographed person's face is merely captured well, but an image in which a person important to the user appears.
- the present invention has been made under such a background, and an object thereof is to provide an image evaluation apparatus capable of evaluating an image based on an object such as a person appearing in each image.
- An image evaluation apparatus according to the present invention evaluates each of a plurality of images each including an object, each object being associated with cluster information indicating the cluster to which the object belongs. The apparatus comprises: first value calculating means for calculating, for an object included in one image, a first value based on the cluster to which the object belongs;
- second value calculating means for calculating, for the object included in the one image, a second value indicating a feature with which the object appears in the one image;
- and evaluation means for calculating the importance of one or more objects appearing in the one image based on the calculated first value and second value, and calculating the importance of the one image.
- the image evaluation apparatus can evaluate an image based on an object such as a person appearing in each image.
- the evaluation means includes object importance calculating means for calculating the importance of the object appearing in the one image based on the calculated first value and second value, and image importance calculating means for calculating, based on the calculated object importance, the importance indicating the evaluation of the one image.
- the first value calculating means calculates, as the first value, the appearance frequency, among all objects included in the plurality of images, of objects belonging to the cluster to which an object included in the one image belongs
- the second value calculating means calculates, for the object included in the one image, an occupancy indicating the area occupied by the object in the one image
- the object importance calculating means may calculate the importance of the object based on the calculated frequency and occupancy.
- since the importance is calculated using both the frequency based on the cluster of the object included in the one image and the occupancy indicating the area occupied by the object, this configuration can contribute to calculating an importance that matches the user's expectation.
- the frequency calculating means calculates a first frequency indicating the frequency of objects belonging to the cluster of the first object and a second frequency indicating the frequency of objects belonging to the cluster of the second object, and the occupancy calculating means calculates a first occupancy indicating the area the first object occupies in the one image and a second occupancy indicating the area the second object occupies in the one image.
- the object importance calculating means calculates a first importance, which is the importance of the first object, based on the first frequency and the first occupancy, and a second importance, which is the importance of the second object, based on the second frequency and the second occupancy, and the image importance calculating means may calculate the importance indicating the evaluation of the one image based on the first importance and the second importance.
- since the importance of the image is calculated based on the importances of both objects, for example, the importance of an image in which a plurality of persons appear can be calculated as a high value.
- the object may be a face of a person cut out from the one image.
- This configuration is particularly suitable when, for example, a family group photo is to be evaluated.
- the evaluation unit may calculate the importance of both objects while propagating the importance between both objects.
- the plurality of images including objects may number P (P is a natural number), and the clusters to which the objects included in the P images belong may number Q (Q is a natural number); the first value calculating means calculates a first value for each of the Q clusters
- the second value calculating means calculates, for each object appearing in the P images, a second value indicating the feature with which the object appears in each image
- node creating means creates nodes corresponding to the P images and the Q clusters, and link setting means sets link values between the nodes using the second values; adjacency matrix generating means generates an adjacency matrix representing the graph composed of the nodes created by the node creating means and the link values set by the link setting means
- eigenvector calculating means calculates the principal eigenvector of the generated adjacency matrix, and importance calculating means calculates the importance indicating the evaluation of each of the P images based on the elements of the calculated principal eigenvector.
- the weight can be propagated between nodes constituting the graph.
- the importance can be propagated between the image and the object based on the link between the node A of the image and the node a of the object.
- the link setting means may set the value of the link from the node A to the node a, among the link values between the nodes A and a, using the second value, and may set the value of the link from the node a to the node A without using the second value.
- in this way, the link value from the node A (the node of the image A) to the node a (the node of the cluster a to which an object included in the image A belongs) is set using the second value; for example, when the object appears large in the image A, increasing the link weight makes it possible to propagate a weight that reflects the content of the image A.
- the second value calculated by the second value calculating means may be an occupancy indicating the area occupied by the object in each image.
- the second value calculating means calculates the degree of occupation of the background, which is an area where no object appears in each image
- the node creating means creates one node indicating a dummy
- for a node A indicating a certain image A and the one dummy node Z, the link setting means may set the value of the link from the node A to the node Z, among the link values between the nodes A and Z, using the occupancy of the background in the image A.
- the link setting means may set both the value of the link from the node A to the node a and the value of the link from the node a to the node A using the second value.
- each of the plurality of images is associated with a shooting date and time, and the apparatus further includes receiving means for receiving designation of a date-and-time range; the first value calculating means and the second value calculating means may perform the calculation using the images included in the designated range, and may refrain from using the images not included in the designated range.
- the evaluation means can calculate the importance of the object and the image corresponding to the designated range.
- the image evaluation method according to the present invention evaluates each of a plurality of images each including an object, and comprises: an acquisition step of acquiring cluster information indicating the cluster to which each object belongs; a first value calculating step of calculating, for an object included in one image, a first value based on the cluster to which the object belongs; a second value calculating step of calculating, for the object included in the one image, a second value indicating a feature with which the object appears in the one image;
- and an evaluation step of calculating the importance of one or more objects appearing in the one image based on the calculated first value and second value, and calculating the importance of the one image.
- the program according to the present invention causes a computer to execute image evaluation processing for evaluating each of a plurality of images each including an object.
- the image evaluation processing comprises: an acquisition step of acquiring cluster information indicating the cluster to which each object belongs; a first value calculating step of calculating, for an object included in one image, a first value based on the cluster to which the object belongs; a second value calculating step of calculating, for the object included in the one image, a second value indicating the feature with which the object appears in the one image; and an evaluation step of calculating the importance of one or more objects appearing in the one image based on the calculated first value and second value, and calculating the importance of the one image.
- the integrated circuit according to the present invention evaluates each of a plurality of images each including an object, each object being associated with cluster information indicating the cluster to which the object belongs.
- the integrated circuit comprises: first value calculating means for calculating, for an object included in one image, a first value based on the cluster to which the object belongs;
- second value calculating means for calculating, for the object included in the one image, a second value indicating the feature with which the object appears in the one image;
- and evaluation means for calculating the importance of one or more objects appearing in the one image based on the calculated first value and second value, and calculating the importance of the one image.
- the image evaluation system 1 includes an image evaluation device 2 and an SD memory card 4.
- the image evaluation apparatus 2 includes an image acquisition unit 10, an object extraction unit 11 (including an object feature amount extraction unit 12 and a clustering unit 14), a storage unit 15 (including a cluster information storage unit 16, a frequency storage unit 26, an occupancy storage unit 28, an object importance information storage unit 34a, and an image importance information storage unit 34b), an object appearance feature amount calculation unit 18 (including a frequency calculation unit 20 and an occupancy calculation unit 22), an evaluation unit 29 (including an object importance calculation unit 30 and an image importance calculation unit 32), and a display control unit 36.
- the image acquisition unit 10 includes an SD card reader, and acquires image data from the SD memory card 4 inserted in the SD card slot.
- the object feature amount extraction unit 12 extracts the image feature amount of the object appearing in the image for the image data acquired by the image acquisition unit 10 and outputs it as the object feature amount.
- as a method for extracting the image feature amount related to the face from an image, an extraction method using a Gabor filter may be used (for the Gabor filter, see Reference 1 described later).
- the clustering unit 14 performs clustering based on the object feature amounts output from the object feature amount extraction unit 12, and stores cluster information indicating the result in the cluster information storage unit 16.
- as the clustering method, the k-means method (see Reference 1), which is one of the non-hierarchical methods (methods that perform clustering by giving a representative to each of a fixed number of clusters), can be used.
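As a rough illustration of such non-hierarchical clustering, the following is a minimal k-means sketch in Python. It is not the patent's implementation; the 2-D "face feature vectors" and k=2 are invented for illustration only.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: assign each point to its nearest centroid,
    then recompute centroids, for a fixed number of rounds."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # index of the nearest centroid (squared Euclidean distance)
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        for i, members in enumerate(clusters):
            if members:  # keep the old centroid if a cluster emptied out
                centroids[i] = tuple(sum(xs) / len(xs) for xs in zip(*members))
    return centroids, clusters

# Hypothetical feature vectors: F1 and F2 similar, F3 and F4 similar
faces = [(0.9, 0.1), (0.8, 0.2), (0.1, 0.9), (0.2, 0.8)]
centroids, clusters = kmeans(faces, k=2)
```

With well-separated features such as these, the two similar pairs end up in the same clusters, mirroring how F1/F2 become person a and F3/F4 become person b in the example below.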
- the storage unit 15 includes a cluster information storage unit 16, a frequency storage unit 26, an occupancy level storage unit 28, an object importance level information storage unit 34a, and an image importance level information storage unit 34b.
- storage part 15 can be comprised from RAM, for example.
- the object appearance feature amount calculation unit 18 calculates an object appearance feature amount based on the cluster information stored in the cluster information storage unit 16.
- the "object appearance feature amount" is an amount indicating how objects belonging to the same cluster appear in the images. Specifically, it is composed of two types: (1) a "frequency" indicating how many objects belonging to the same cluster appear across all images, and (2) an "occupancy" indicating the ratio of the area of each image occupied by objects belonging to a cluster.
- the frequency and the occupancy are calculated by the frequency calculator 20 and the occupancy calculator 22 included in the object appearance feature amount calculator 18. The calculated frequency is stored in the frequency storage unit 26, and the calculated occupancy is stored in the occupancy storage unit 28.
- the object importance calculation unit 30 calculates, based on the stored object appearance feature amounts (frequency and occupancy), an object importance indicating the importance, across all image data, of each object appearing in each image.
- the object importance level information storage unit 34a is constituted by, for example, a RAM, and stores object importance level information including the object importance level calculated by the object importance level calculation unit 30.
- the image importance calculation unit 32 calculates the importance (image importance) of each image based on the object importance.
- the image importance level information storage unit 34b is composed of, for example, a RAM, and stores image importance level information including the image importance level calculated by the image importance level calculation unit 32.
- the display control unit 36 displays the calculated image importance on the screen of the display 6.
<Operation> Next, the flow up to the evaluation of an image will be described.
- processing is performed in the order of image acquisition / clustering (S11), object appearance feature amount calculation (S12), and image evaluation (S13).
- the image acquisition unit 10 acquires the image data stored in the SD memory card 4 from the SD memory card 4 (S21).
- in this example, the SD memory card 4 stores image data for three images A to C, and the image acquisition unit 10 acquires the image data of the images A to C.
- the object feature quantity extraction unit 12 cuts out a face area from the image for the acquired image data, and extracts the face feature quantity as an object feature quantity (S22). Then, the clustering unit 14 performs clustering using the extracted object feature amount, and stores the cluster information, which is the clustered result, in the cluster information storage unit 16 (S23).
- in this example, the object feature amount extraction unit 12 cuts out four faces F1 to F4 from the three images A to C (FIG. 4A) and extracts the feature amounts of the faces F1 to F4 (FIG. 4B). Then, the clustering unit 14 clusters the similar faces F1 and F2 among the four faces F1 to F4 as the person a, and the faces F3 and F4 as the person b.
- FIG. 5 is a diagram showing a data structure of cluster information indicating the result of clustering.
- the cluster information is information indicating which cluster each face appearing in the image belongs to, and includes items of “cluster name” 17a, “face” 17b, and “image” 17c.
- the object appearance feature amount calculation unit 18 acquires cluster information from the cluster information storage unit 16 (S31).
- the frequency calculation unit 20 calculates the frequency by counting the number of faces belonging to the same cluster ("cluster name" 17a is the same) based on the acquired cluster information (S32). Then, frequency information including the calculated frequency is stored in the frequency storage unit 26.
- Fig. 7 shows the data structure of frequency information.
- the frequency information includes items of “cluster name” 27a, “image” 27b, and “frequency” 27c.
- the value of “frequency” 27c is obtained by counting the number of images of “image” 27b.
- the frequency of the person a is 2, and the frequency of the person b is 2.
- since the frequency is the number of faces belonging to the same cluster, it can also be regarded as the number of members of the cluster.
- Fig. 8 shows the data structure of occupancy information.
- the occupancy information includes items of “image” 29a, “occupation degree of face belonging to person a” 29b, “occupation degree of face belonging to person b” 29c, and “occupation degree of background” 29d.
- the "occupancy of the face belonging to the person a" 29b is the occupancy, in each image, of the face (object) whose cluster is the person a. For example, in the image A, the face F1 belongs to the cluster of the person a (see FIG. 5) and the face area is 30% of the image, so the "occupancy of the face belonging to the person a" is 0.3.
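The frequency (S32) and occupancy (S33) calculations can be sketched as follows for the worked example. The dictionary layout is assumed for illustration, and the occupancy of F4 (not stated in the text) is a made-up value:

```python
# Cluster information as in FIG. 5: face -> (cluster, image)
cluster_info = {
    "F1": ("a", "A"), "F2": ("a", "B"),
    "F3": ("b", "B"), "F4": ("b", "C"),
}
# Face-area ratios per image (F1, F2, F3 from the text; F4 assumed)
face_area = {"F1": 0.3, "F2": 0.4, "F3": 0.3, "F4": 0.5}

# Frequency: count faces belonging to the same cluster (S32)
frequency = {}
for face, (cluster, image) in cluster_info.items():
    frequency[cluster] = frequency.get(cluster, 0) + 1

# Occupancy: area a cluster's face occupies in each image (S33)
occupancy = {}  # (image, cluster) -> occupancy
for face, (cluster, image) in cluster_info.items():
    occupancy[(image, cluster)] = face_area[face]
```

This reproduces the example values: persons a and b each have frequency 2, and the occupancy of person a in image A is 0.3.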
- the object importance calculation unit 30 sets one image to be evaluated (S41). Then, the object importance calculation unit 30 acquires, from the frequency storage unit 26, frequency information indicating the frequencies of the objects shown in the set image, and acquires, from the occupancy storage unit 28, occupancy information indicating the occupancies of the objects (S42).
- the object importance calculation unit 30 calculates the object importance by multiplying the acquired frequency and occupancy.
- steps S42 to S44 are steps for calculating object importance levels for all objects appearing in the evaluation target image. For example, if there are three objects appearing in the evaluation target image, the object importance degree is calculated for each of the three objects. The calculated object importance level is stored in the object importance level information storage unit 34a.
- the image importance calculation unit 32 calculates the image importance by adding the object importances of all the objects in the evaluation target image, and stores the image importance in the image importance information storage unit 34b as image importance information (S45).
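Continuing the worked example, steps S43 to S45 (object importance = frequency × occupancy; image importance = sum of the object importances in the image) can be sketched as follows. The occupancy of F4 is an assumed value:

```python
# Inputs from the worked example: face -> (cluster, image, occupancy)
faces = {
    "F1": ("a", "A", 0.3),
    "F2": ("a", "B", 0.4),
    "F3": ("b", "B", 0.3),
    "F4": ("b", "C", 0.5),  # assumed occupancy
}
frequency = {"a": 2, "b": 2}

# S43: object importance = frequency of the face's cluster x its occupancy
object_importance = {
    face: frequency[cluster] * occ
    for face, (cluster, image, occ) in faces.items()
}

# S45: image importance = sum of object importances of the faces it contains
image_importance = {}
for face, (cluster, image, occ) in faces.items():
    image_importance[image] = image_importance.get(image, 0.0) + object_importance[face]
# image A contains only F1, so its importance is 2 * 0.3 = 0.6
```

Image B, which contains the two faces F2 and F3, accumulates both object importances (0.8 + 0.6), illustrating why images with several important objects score higher.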
- if there is an image whose image importance has not yet been calculated (S46: No), the process returns to step S41 to set the uncalculated image as the evaluation target and obtain its image importance.
- the display control unit 36 displays the image importances on the screen of the display 6 using the evaluation information stored in the image importance information storage unit 34b (S47).
- a general method can be used to display the image importance: the score of each image may be displayed, or a ranking by score may be displayed. Even if the score or ranking is not displayed directly, an image with a higher score may be displayed preferentially (for example, displayed at a larger size, or shown more frequently in a slide show).
- the image A is an image in which the object of the face F1 (belonging to the cluster of the person a) appears (see FIG. 5); the frequency of the person a to which the face F1 belongs is 2 (see FIG. 7), and the occupancy of the face F1 is 0.3 (see FIG. 8).
- accordingly, the importance of the face F1 is 2 × 0.3 = 0.6, and since the face F1 is the only object in the image A, the image importance calculation unit 32 sets the importance 0.6 of the face F1 as the image importance of the image A as it is.
- Image B is an image in which the object of face F2 (belonging to the cluster of person a) and the object of face F3 (belonging to the cluster of person b) are shown (see FIG. 5).
- the frequency of the person a to which the face F2 belongs is 2, and the frequency of the person b to which the face F3 belongs is also 2 (see FIG. 7).
- the occupation degree of the face F2 is 0.4, and the occupation degree of the face F3 is 0.3 (see FIG. 8).
- Image C is an image in which an object of face F4 (belonging to the cluster of person b) is shown.
- FIG. 11 shows image importance information corresponding to images A to C.
- as described above, the objects shown in the images are clustered, and the image importance can be calculated using the frequency of the objects belonging to the same cluster (person a, person b) and the occupancy of the objects included in each image, so that the image importance increases in proportion to the frequency and the occupancy.
- when, for example, three objects appear in an image, the object importances of all three objects are reflected, raising the image importance. In the case where images held by a family are to be evaluated, the persons making up the family are evaluated as important objects, and a photograph in which these important objects are gathered (a family group photo) can be evaluated particularly highly. This therefore contributes to evaluating images in accordance with the user's intention.
- as described above, an image showing an object such as a family member or a pet that the user photographs often can be evaluated as an important photograph for the user, which has the effect of facilitating the selection of images.
- in the above, the object importance is obtained by multiplying the frequency and the occupancy (FIG. 9: S43), but the calculation is not limited thereto.
- for example, the object importance F_I may be obtained as F_I = log(F_R × 100 + 1) / log(F_N + 1) × F_F (Formula 1)
- where F_N is the number of objects (persons) appearing in the image, F_R is the occupancy of the object, and F_F is the frequency of the cluster to which the object belongs.
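Formula 1, F_I = log(F_R × 100 + 1) / log(F_N + 1) × F_F, can be sketched as follows. Note that F_F is taken here to be the cluster frequency, which the surrounding text implies but does not define explicitly:

```python
import math

def object_importance(F_R, F_N, F_F):
    """Formula 1: F_I = log(F_R * 100 + 1) / log(F_N + 1) * F_F.
    F_R: occupancy of the object (0..1)
    F_N: number of objects (persons) appearing in the image
    F_F: frequency of the object's cluster (assumed meaning)"""
    return math.log(F_R * 100 + 1) / math.log(F_N + 1) * F_F

# e.g. face F1 in image A: occupancy 0.3, one person in the image, frequency 2
f1 = object_importance(F_R=0.3, F_N=1, F_F=2)
```

The logarithm of the occupancy dampens the effect of very large faces, while dividing by log(F_N + 1) discounts crowded images; the importance still grows with both occupancy and frequency.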
- the image importance calculation unit 32 calculates the image importance by adding the object importances of the objects; in short, any method in which the score becomes higher as the number of important objects increases may be used, and the calculation is not limited to addition.
- for example, the object importances of the objects may be multiplied together. In that case, each factor is kept from falling below 1 so that the value is not reduced by the multiplication; for example, a percentage is used instead of the occupancy.
- FIG. 13 is a functional block diagram of the image evaluation system 101 according to the second embodiment.
- the graph generation unit 40 of the image evaluation apparatus 102 includes an image/subject node generation unit 44a1 and a dummy node generation unit 44a2 that generate nodes, and a subject link setting unit 44b1 and a dummy link setting unit 44b2 that set link values indicating the connection relationships between nodes.
- the image / subject node generation unit 44a1 generates a node indicating an image and a node indicating an object based on the cluster information (see FIG. 5) acquired from the cluster information storage unit 16.
- the dummy node generation unit 44a2 generates a dummy node (background node).
- the subject link setting unit 44b1 sets links between the image nodes and the object nodes based on the occupancies acquired from the occupancy storage unit 28 and the cluster information acquired from the cluster information storage unit 16. Since these links have direction, the graph is of the type called a directed graph.
- the dummy link setting unit 44b2 similarly sets a link between the image node and the dummy node.
- the graph storage unit 46 stores the graph generated by the graph generation unit 40.
- the evaluation unit 48 includes an adjacency matrix generation unit 50, an eigenvector calculation unit 52, and an importance calculation unit 53.
- the adjacency matrix generation unit 50 generates an adjacency matrix that represents the graph stored in the graph storage unit 46.
- the eigenvector calculation unit 52 calculates the eigenvector of the generated adjacency matrix.
- the importance calculation unit 53 calculates the object importance of each object and the image importance of each image based on the calculated eigenvector component.
- the calculated object importance is stored in the object importance information storage unit 34a, and the calculated image importance is stored in the image importance information storage unit 34b.
<Operation> Next, the flow up to the evaluation of an image will be described.
- the processing is performed in the order of image acquisition / clustering (S11), occupancy calculation / graph generation (S51), adjacency matrix generation / eigenvector calculation (S52).
- first, the occupancy calculation unit 22 acquires the cluster information from the cluster information storage unit 16 (S31), and calculates which person occupies how much of each image (S33). Since this calculation method is the same as that described in Embodiment 1 (see FIG. 6), the same step numbers are assigned and the description is omitted.
- the image / subject node generation unit 44a1 and the dummy node generation unit 44a2 of the graph generation unit 40 acquire the cluster information of the cluster information storage unit 16 and the occupancy of the occupancy storage unit 28 (S61).
- the image/subject node generation unit 44a1 generates P nodes corresponding to the images included in the cluster information (P images) and Q nodes corresponding to the types of clusters (Q clusters).
- the dummy node generation unit 44a2 generates one node corresponding to the background (one abstract dummy Z is set for all the images). The total number of generated nodes is therefore (1 + P + Q) (S62).
- for P and Q, the number of images in the "image" 27b and the number of cluster types in the "cluster name" 27a calculated by the frequency calculation unit 20 (FIG. 7) can be used.
- the subject link setting unit 44b1 and the dummy link setting unit 44b2 of the graph generation unit 40 set a link between the created nodes (S63).
- the basic rules for setting a link are as follows.
- the value of the link from each image to each cluster is the occupancy of objects belonging to that cluster.
- the value of the link from the dummy Z to each cluster and each image is a value obtained by dividing 1 by the number of linked nodes (1 is evenly distributed).
- FIG. 16 shows an example in which a link is set between the image A, the person a, and the dummy Z.
- the value of the link from the image A to the person a is the occupancy α(a) of the person a in the image A.
- the value of the link from the person a to the image A is a fixed value β.
- the value of the link from the image A to the dummy Z (F(A→Z)) is the occupancy of the background in the image A, 1 − α(a). Further, the value of the link from the person a to the dummy Z (F(a→Z)) is 1 − β.
- the value of the link from the dummy Z to the image A (F(Z→A)) and the value of the link from the dummy Z to the person a (F(Z→a)) are each 0.5, obtained by dividing 1 by the number of link-destination nodes, 2.
- FIG. 17 is a diagram for explaining the link setting according to such a link rule.
- the link from the image A to the person a is 0.3, which is the degree of occupation of the person a.
- the link from the person a to the image A has the fixed value β.
- since β = 0.1 here, its value is 0.1.
- the link from image A to dummy Z has a background occupancy of 0.7 in image A.
- the value of the link from the dummy Z to the other four nodes is 0.25 obtained by dividing 1 by the number 4 of link destination nodes.
- the value of the link between other nodes is set according to the above rule.
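The link rules above (image→cluster: the cluster's occupancy; cluster→image: a fixed value; image→Z: the background occupancy; cluster→Z: one minus the fixed value; Z→each destination: 1 divided evenly) can be sketched as follows for the FIG. 16 three-node example. The symbol β stands for the fixed value, which is 0.1 in the text:

```python
BETA = 0.1  # fixed value for cluster -> image links

def build_links(occupancy, images, clusters):
    """Set directed link values per the rules in the text.
    occupancy[(image, cluster)] = area the cluster's object occupies."""
    links = {}  # (src, dst) -> link value
    for img in images:
        background = 1.0
        for c in clusters:
            occ = occupancy.get((img, c), 0.0)
            if occ > 0.0:
                links[(img, c)] = occ         # image -> cluster: occupancy
                links[(c, img)] = BETA        # cluster -> image: fixed value
                background -= occ
        links[(img, "Z")] = background        # image -> dummy: background occupancy
    for c in clusters:
        links[(c, "Z")] = 1.0 - BETA          # cluster -> dummy
    targets = list(images) + list(clusters)
    for t in targets:
        links[("Z", t)] = 1.0 / len(targets)  # dummy distributes 1 evenly
    return links

# FIG. 16 example: image A, person a, dummy Z; occupancy of a in A is 0.3
links = build_links({("A", "a"): 0.3}, images=["A"], clusters=["a"])
```

This reproduces the example values: A→a is 0.3, A→Z is 0.7, a→A is 0.1, a→Z is 0.9, and Z distributes 0.5 to each of its two destinations.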
- after completing the link setting, the graph generation unit 40 determines the links between the nodes and outputs the completed graph (S64), and the graph storage unit 46 stores the output graph as a table structure.
- the graph of FIG. 17 is stored as a table structure as shown in FIG.
- the adjacency matrix generation unit 50 generates an adjacency matrix M using the table structure stored in the graph storage unit 46 (S71). If the table structure of FIG. 18 is used, an adjacency matrix M that is a fifth-order (5 rows and 5 columns) square matrix as shown in FIG. 20 is generated.
- Steps S72 to S74 are operations of “power method”, which is a kind of method for simultaneously obtaining a main eigenvalue and a main eigenvector belonging to the main eigenvalue.
- the eigenvector calculation unit 52 determines the converged eigenvector P as the main eigenvector P (S75), and the importance calculation unit 53 extracts an image component and an object component from the main eigenvector components (S76).
- the importance calculation unit 53 normalizes both sets of components (so that each sums to 1) and outputs them to the image importance information storage unit 34b (S77).
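Steps S71 to S77 (adjacency matrix, power iteration to the principal eigenvector, normalization) can be sketched in pure Python. The 3×3 matrix below is built from the FIG. 16 link values rather than the 5×5 matrix of FIG. 20:

```python
def power_method(M, iters=100):
    """Repeatedly multiply a start vector by M and renormalize;
    the result converges to the principal eigenvector of M."""
    n = len(M)
    p = [1.0 / n] * n
    for _ in range(iters):
        q = [sum(M[i][j] * p[j] for j in range(n)) for i in range(n)]
        norm = sum(abs(x) for x in q)
        p = [x / norm for x in q]
    return p

# Rows/columns ordered [A, a, Z]; M[i][j] = link value from node j to node i
M = [
    [0.0, 0.1, 0.5],  # into image A: from a (0.1), from Z (0.5)
    [0.3, 0.0, 0.5],  # into person a: from A (0.3), from Z (0.5)
    [0.7, 0.9, 0.0],  # into dummy Z: from A (0.7), from a (0.9)
]
p = power_method(M)
importances = [x / sum(p) for x in p]  # S77: normalize so the sum is 1
```

Because every column of M sums to 1, the principal eigenvalue is 1 and the iteration converges quickly; the image component and the object component of the resulting vector give the relative importances.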
- FIG. 21 is a diagram for explaining the flow of obtaining the image importance and object importance of an image from the components of the main eigenvector P.
- Image importance and the importance of the persons a and b can be calculated as relative evaluations.
- the recursive eigenvector calculation on the adjacency matrix M representing the link relationships can propagate importance between objects, so that even an object that appears infrequently or does not appear large can have its importance raised by appearing together with an important object.
- for example, suppose the frequencies of three clusters, a person a (son), a person d (son's friend), and a person e (grandmother), satisfy person a > person d > person e, and that the frequency of the person a is very high (meaning that the importance of the person a is high) (FIG. 22A).
- since the person e appears together with the person a, the propagation of importance can raise the importance of the person e. Therefore, although the frequency of the person e is lower than that of the person d, the image importance of the image E in which the person e appears alone can exceed the image importance of the image F in which the person d appears alone.
- in Embodiment 2, the importances of the images and the objects are calculated using a data structure between the images and the objects included (photographed) in the images; however, any structure that defines a set of images, a set related to the images, and the relationships between the sets may be used.
- the set related to the images may be the objects included in the images as in Embodiment 2, but may also be, for example, various kinds of information embedded in the images as in the Exif (Exchangeable image file format) standard, or tags attached to the images.
- Embodiment 3 a data structure between an image and a tag attached to the image will be described as an example of such a data structure.
- FIG. 25 shows the relationship between an image (image T, image U) and a tag (tag a, tag b).
- the image T has a “golf” tag a
- the image U has a “golf” tag a and a “birdy” tag b.
- tag assignment is manually set by the user, for example.
- FIG. 25 shows a link indicating the relationship between the images T and U and the tags a and b.
- the link value from this image to the tag is obtained by multiplying the freshness of the image by 1/10.
- the freshness of the image is obtained from the difference between the current date and time (date and time of evaluation) and the date and time when the image was added.
- Accordingly, the newer the image, the greater the weighting between the images T and U and the tags a and b.
- the image freshness is a value (0 to 1) obtained by normalizing the addition date and time of each image, calculated as:
- Freshness of image T = (addition date/time of image T − oldest addition date/time among the images) / (current date/time − oldest addition date/time among the images)
- This link is assigned a fixed value of 0.01, for example.
- the value of the link from the tag “golf” to the image T is 0.01.
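A minimal sketch of the link values described above: image-to-tag links are the image freshness multiplied by 1/10, and tag-to-image links take the fixed value 0.01. The dates and function names are hypothetical, and date/time differences are taken in minutes as in the freshness definition:

```python
from datetime import datetime

TAG_TO_IMAGE_LINK = 0.01  # fixed value for tag -> image links

def freshness(added, oldest, now):
    """Freshness in [0, 1]: (added - oldest) / (now - oldest),
    with date/time differences taken in minutes."""
    span = (now - oldest).total_seconds() / 60.0
    age = (added - oldest).total_seconds() / 60.0
    return age / span  # assumes span > 0

def image_to_tag_link(added, oldest, now):
    """Link value from an image to one of its tags: freshness * 1/10."""
    return freshness(added, oldest, now) / 10.0

# Hypothetical dates: image T was added in October, the collection starts in January.
now = datetime(2010, 12, 16)
oldest = datetime(2010, 1, 1)
added_T = datetime(2010, 10, 1)
print(image_to_tag_link(added_T, oldest, now))  # newer images link more strongly
print(TAG_TO_IMAGE_LINK)                        # the reverse link stays fixed
```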
- In this way, links representing the relationships between the images T and U and the tags a and b are obtained; then, for example, by using an adjacency matrix as in Embodiment 2, the image importance of the images T and U and the tag importance of the tags a and b can be calculated.
- weights may be returned to the dummy nodes as shown in FIG.
<Supplement>
- The present invention is not limited to the above content; it can also be implemented in various forms for achieving the object of the present invention and objects related or incidental thereto. For example, the following are also possible.
- The k-means method, a non-hierarchical technique, was taken as an example of the clustering technique performed by the clustering unit 14; however, the present invention is not limited to this, and a hierarchical technique such as Ward's method may be used.
- In each embodiment, a human face is described as an example of an object.
- However, the object is not limited to a person; the faces of animals such as dogs and cats may also be targets of image feature extraction.
- the object shown in the image is not limited to the face, and various objects such as a car, a plant, and a building are conceivable.
- the eigenvector calculation unit 52 calculates the eigenvector using the power method, but the present invention is not limited to this. For example, an inverse iteration method may be used.
- the fixed value of the link to the image may be individually set such that the link from the person a to the image A is the fixed value ⁇ 1, and the link from the dummy Z to the image A is the fixed value ⁇ 2.
- the values of the links to images A and B may be determined by using a method for dealing with the problem of nodes that have no outward links and therefore absorb importance (the rank-sink problem).
- the object importance is described based on (A) the frequency of the cluster to which the object belongs and (B) the occupancy of the object.
- the former (A) may be an amount that characterizes the cluster to which the object belongs, and is not limited to the frequency (number of members of the cluster).
- For the latter (B), for example, in addition to the occupancy, weighting based on the person's degree of smile, degree of face orientation, and degree of focus may be applied.
- the importance of each object (F1, F6) may be calculated using a coefficient indicating the degree of smile.
- the image S having a greater smile level has a higher image importance than the image A.
- the calculation formula can be designed so that the more strongly the persons in an image are smiling, the higher the image importance becomes.
- a person's degree of focus is evaluated higher as the person's focus is sharper, and lower as the person is out of focus.
- the technique of the following Reference 3 can be used.
- the importance of each object may also be calculated based on the degree of smile, the face orientation, the degree of focus, and so on, without using the occupancy at all.
- any value may be used as long as it indicates a feature of the object in each image; it is not limited to the occupancy, the degree of smile, the degree of face orientation, and the like.
- the image evaluation device 2 has been described as including the object feature amount extraction unit 12 and the clustering unit 14, but these may instead be located in a device external to the image evaluation device 2.
- the image evaluation system 111 shown in FIG. 24 includes an image evaluation device 112, an SD memory card 4, and a clustering device 114.
- the clustering device 114 includes an object feature amount extraction unit 12 that extracts an object feature amount based on image data acquired from the SD memory card 4, a clustering unit 14, and a cluster storage unit 16.
- the image evaluation apparatus 112 includes a cluster information acquisition unit 60 that acquires cluster information of the cluster storage unit 16.
- connection between the two can take various forms regardless of wired (LAN cable, USB cable, etc.) or wireless (infrared, Bluetooth, etc.).
- The image evaluation apparatus of each embodiment may typically be realized as an LSI (Large Scale Integration) integrated circuit.
- Each circuit may be individually configured as one chip, or one chip may be formed so as to include all or some of the circuits.
- Although described here as LSI, it may also be called IC (Integrated Circuit), system LSI, super LSI, or ultra LSI depending on the degree of integration.
- Furthermore, the method of circuit integration is not limited to LSI, and implementation using a dedicated circuit or a general-purpose processor is also possible.
- An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
- Such recording media include SmartMedia, CompactFlash (registered trademark), Memory Stick (registered trademark), SD memory card, multimedia card, CD-R/RW, DVD±R/RW, DVD-RAM, HD-DVD, BD (Blu-ray Disc), and so on.
- The distributed control program is used by being stored in a memory or the like readable by a processor, and the various functions shown in the embodiments are realized by the processor executing that control program.
- the display 6 at the bottom of FIG. 9 displays the ranking of images with high image importance.
- However, the present invention is not limited to this; a ranking of persons with high object importance may be displayed instead.
- the screen 270 is a general touch-panel screen, and arrows and icons can be selected by touching them with the finger 272.
- the scale bar 274 is marked with “year”, “month”, “week”, and “day” scales.
- the user drags the round button 276 left and right to set the scale, and taps the left and right arrows 278L and 278R to specify the date and time range.
- 2010 is specified as the date / time range.
- the object extraction unit 11 extracts objects from those images, among the images acquired by the image acquisition unit 10, whose shooting date and time fall within 2010 (for example, 2010/01/01 00:00 to 2010/12/31 23:59).
- the evaluation unit 48 calculates the importance corresponding to the image whose shooting date is 2010.
- the display control unit 36 displays the calculated importance.
- The importance corresponding to the designated range may be calculated each time; alternatively, the importance for each designatable date/time range may be calculated and stored in advance.
- the designation of the date and time range is not limited to the touch input as shown in FIG. 27, but may be accepted by numerical input from the keyboard.
- the value of the link from a cluster, such as a person, to an image in which that person appears is a fixed value, but it may instead be set according to the occupancy.
- FIG. 29 is a diagram for explaining the link setting according to such a link rule.
- Reference 1 "Face recognition using weighted matching based on Gabor feature information" Kazuhiro Hotta (Saitama University (Japan Society for the Promotion of Science) Research Fellow) IEICE technical report.
- According to the present invention, images important to a user can be efficiently retrieved from the enormous amount of image content held by a plurality of users such as family members; compared with conventional evaluation methods, the user can browse and search the content easily and without burden.
- For example, when creating a New Year's card and searching for a photograph representative of the family from the enormous amount of photo content accumulated by the family, the user can easily select the desired image from the search results based on the image ranking. The invention is useful for stationary terminals such as personal computers and server terminals, and also for mobile terminals such as digital cameras and mobile phones.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Quality & Reliability (AREA)
- Health & Medical Sciences (AREA)
- Library & Information Science (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- General Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Image Analysis (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
Description
Hereinafter, the present embodiment will be described with reference to the drawings.
<Configuration>
As shown in FIG. 1, the image evaluation system 1 includes an image evaluation device 2 and an SD memory card 4.
<Operation>
Next, the flow up to evaluating an image will be described.
The clustering unit 14 then clusters the four faces F1 to F4 so that the mutually similar faces F1 and F2 form person a, and the faces F3 and F4 form person b.
Image A is an image in which the object of face F1 (belonging to the cluster of person a) appears (see FIG. 5); the frequency of person a, to whom face F1 belongs, is 2 (see FIG. 7), and the occupancy of face F1 is 0.3 (see FIG. 8).
Image B is an image in which the object of face F2 (belonging to the cluster of person a) and the object of face F3 (belonging to the cluster of person b) appear (see FIG. 5). The frequency of person a, to whom face F2 belongs, is 2, and the frequency of person b, to whom face F3 belongs, is also 2 (see FIG. 7). The occupancy of face F2 is 0.4 and the occupancy of face F3 is 0.3 (see FIG. 8).
Image C is an image in which the object of face F4 (belonging to the cluster of person b) appears. The object importance calculation unit 30 calculates an object importance of 0.6 (= 2 × 0.3) from the frequency and occupancy of face F4, and the image importance calculation unit 32 takes the importance 0.6 as it is as the image importance of image C.
FI = log(FR × 100 + 1) / log(FN + 1) × FF   (Formula 1)
The importance may also be obtained from the above formula, where FN is the number of objects (persons) appearing in the image and FR is the occupancy of the object.
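Formula 1 can be written directly in code. FF is treated here simply as a given input factor (its derivation is outside this fragment), and the ratio log(FR × 100 + 1) / log(FN + 1) is independent of the logarithm base:

```python
import math

def object_importance(FR, FN, FF):
    """FI = log(FR*100 + 1) / log(FN + 1) * FF  ... (Formula 1)

    FR: occupancy of the object in the image (0..1)
    FN: number of objects (persons) appearing in the image
    FF: remaining factor, taken here as a given input
    """
    return math.log(FR * 100 + 1) / math.log(FN + 1) * FF

# A larger occupancy raises the importance; more co-appearing objects lower it.
print(object_importance(FR=0.4, FN=1, FF=1.0))
print(object_importance(FR=0.4, FN=3, FF=1.0))
```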
(Embodiment 2)
In Embodiment 2, an image is evaluated based on the ties between the objects appearing in the image and the clusters to which those objects belong. In addition, importance is propagated between co-occurring objects (multiple objects appearing together in one image). As a result, even an object that appears infrequently or is not captured prominently can have its importance raised if it co-occurs with other important objects.
<Configuration>
FIG. 13 is a functional block diagram of the image evaluation system 101 according to Embodiment 2.
<Operation>
Next, the flow up to evaluating an image will be described.
MP = λP   (Formula 2)
From this formula, the eigenvector P that holds for the adjacency matrix M with eigenvalue λ is obtained (S72). If this eigenvector P has not converged (S73), the calculated eigenvector P is multiplied by the original matrix (S74), and the calculation is repeated recursively until the eigenvector P of the matrix converges (S73: Yes).
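The recursive computation of steps S72-S74 corresponds to the standard power method. A minimal pure-Python sketch follows; the tolerance, iteration cap, and L1 normalization are assumptions of this sketch, not specified in the embodiment:

```python
def power_method(M, tol=1e-9, max_iter=1000):
    """Return (eigenvalue, eigenvector) for the dominant eigenpair of the
    square matrix M by repeated multiplication (steps S72-S74)."""
    n = len(M)
    p = [1.0 / n] * n                       # initial vector
    for _ in range(max_iter):
        q = [sum(M[i][j] * p[j] for j in range(n)) for i in range(n)]
        norm = sum(abs(v) for v in q)
        q = [v / norm for v in q]           # keep the vector at unit L1 norm
        if max(abs(a - b) for a, b in zip(q, p)) < tol:
            p = q                           # converged (S73: Yes)
            break
        p = q                               # not converged: multiply again (S74)
    # Rayleigh-style estimate of the eigenvalue lambda with M P = lambda P
    Mp = [sum(M[i][j] * p[j] for j in range(n)) for i in range(n)]
    lam = sum(x * y for x, y in zip(Mp, p)) / sum(v * v for v in p)
    return lam, p

# Hypothetical 2x2 column-stochastic adjacency matrix.
lam, p = power_method([[0.9, 0.2], [0.1, 0.8]])
print(lam, p)  # lam is close to 1; p approaches [2/3, 1/3]
```

For a column-stochastic adjacency matrix the dominant eigenvalue is 1, so the converged P can be read directly as a stationary importance distribution.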
(Embodiment 3)
In Embodiment 2, the importance of images and objects was calculated using the data structure between images and the objects contained in (appearing in) them; in essence, however, any data structure may be used as long as it defines a set of images, a set related to the images, and a relationship between the two sets.
Freshness of image T = (addition date/time of image T − oldest addition date/time among the images) / (current date/time − oldest addition date/time among the images)
The freshness is calculated from the above formula, with the date/time differences computed in minutes.
<Supplement>
While embodiments of the present invention have been described above, the present invention is not limited to the above content and can also be implemented in various forms for achieving the object of the present invention and objects related or incidental thereto; for example, the following are also possible.
In each embodiment, the k-means method, a non-hierarchical technique, was given as an example of the clustering technique performed by the clustering unit 14; however, the technique is not limited to this, and a hierarchical technique such as Ward's method may be used.
In each embodiment, a human face was described as an example of an object; however, the object is not limited to a person, and the faces of animals such as dogs and cats may also be targets of image feature extraction. Moreover, the objects appearing in an image are not limited to faces; various objects such as cars, plants, and buildings are conceivable.
In Embodiment 2, the eigenvector calculation unit 52 was described as calculating the eigenvector using the power method, but the method is not limited to this; for example, the inverse iteration method may be used.
In Embodiment 2, the links to images A and B were described as using a fixed value β (= 0.1) (see FIG. 17), but the value of β is not limited to 0.1 and can be set as appropriate. The fixed values of the links to an image may also be set individually, for example a fixed value β1 for the link from person a to image A and a fixed value β2 for the link from dummy Z to image A.
In Embodiment 2, an example was described in which the graph is generated with the backgrounds of all images as a single node (dummy Z); however, this is not limiting, and a plurality of background nodes may be created, or a graph without a background node may be created.
In each embodiment, an example was described in which the importance of an object is obtained based on (A) the frequency of the cluster to which the object belongs and (B) the occupancy of the object.
In each embodiment, the object feature amount extraction unit 12 and the clustering unit 14 were described as being provided inside the image evaluation device 2, but they may instead be located in a device external to the image evaluation device 2.
In each embodiment, the images to be evaluated were described as being stored in the SD memory card 4, but the recording medium is not limited to this; recording media such as SmartMedia, CompactFlash (registered trademark), Memory Stick (registered trademark), SD memory card, multimedia card, CD-R/RW, DVD±R/RW, DVD-RAM, HD-DVD, and BD (Blu-ray Disc) may also be used.
The image evaluation device of each embodiment may typically be realized as an LSI (Large Scale Integration) integrated circuit. Each circuit may be individually made into one chip, or a single chip may include all or some of the circuits. Although referred to here as LSI, it may also be called IC (Integrated Circuit), system LSI, super LSI, or ultra LSI depending on the degree of integration. Furthermore, the method of circuit integration is not limited to LSI, and implementation with a dedicated circuit or a general-purpose processor is also possible. An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
A control program consisting of program code for causing the processors of various devices such as computers, and various circuits connected to those processors, to execute the image evaluation processing shown in each embodiment (see FIGS. 2, 3, 6, 9, 14, 15, 19, etc.) may be recorded on a recording medium, or circulated and distributed via various communication channels.
The display 6 at the bottom of FIG. 9 was described as displaying a ranking of images with high image importance, but this is not limiting; a ranking of persons with high object importance may be displayed instead.
In Embodiment 2, the value of a link from a cluster, such as a person, to an image in which that person appears was a fixed value, but it may instead be set according to the occupancy.
<References>
(1) Reference 1
"Face recognition using weighted matching based on the information content of Gabor features"
Kazuhiro Hotta (Saitama University; JSPS Research Fellow)
IEICE Technical Report, HIP,
Human Information Processing 100(34), pp. 31-38, 2000-05-04
(2) Reference 2
The PageRank citation ranking: Bringing order to the Web.
Page, Lawrence; Brin, Sergey; Motwani, Rajeev and Winograd, Terry (1999).
(3) Reference 3
Japanese Patent Application Publication No. 2006-172417 (JP 2006-172417 A)
2, 102, 112 Image evaluation device
4 SD memory card
10 Image acquisition unit
11 Object extraction unit
12 Object feature amount extraction unit
14 Clustering unit
16 Cluster information storage unit
18 Object appearance feature amount calculation unit
20 Frequency calculation unit
22 Occupancy calculation unit
24 Object appearance feature amount storage unit
26 Frequency storage unit
28 Occupancy storage unit
30 Object importance calculation unit
32 Image importance calculation unit
34a Object importance information storage unit
34b Image importance information storage unit
36 Display control unit
40 Graph generation unit
44a1 Image/subject node generation unit
44a2 Dummy node generation unit
44b1 Subject link setting unit
44b2 Dummy link setting unit
46 Graph storage unit
48 Evaluation unit
50 Adjacency matrix generation unit
52 Eigenvector calculation unit
53 Importance calculation unit
Claims (15)
- 1. An image evaluation device for evaluating each of a plurality of images that contain objects, wherein cluster information indicating the cluster to which the object belongs is associated with each object, the device comprising: first value calculation means for calculating, for an object contained in one image, a first value based on the cluster to which the object belongs; second value calculation means for calculating, for the object contained in the one image, a second value indicating a feature with which the object appears in the one image; and evaluation means for calculating, based on the calculated first value and second value, the importance of one or more objects appearing in the one image and the importance of the one image.
- 2. The image evaluation device according to claim 1, wherein the evaluation means includes: object importance calculation means for calculating, based on the calculated first value and second value, the importance of an object appearing in the one image; and image importance calculation means for calculating, based on the calculated importance of the object, an importance indicating the evaluation of the one image.
- 3. The image evaluation device according to claim 2, wherein the first value calculation means calculates the frequency with which objects of the cluster to which the object contained in the one image belongs appear among all the objects contained in the plurality of images, the second value calculation means calculates, for the object contained in the one image, an occupancy indicating the area that the object occupies in the one image, and the object importance calculation means calculates the importance of the object based on the calculated frequency and occupancy.
- 4. The image evaluation device according to claim 3, wherein, when the one image contains a first object and a second object, the frequency calculation means calculates a first frequency indicating the frequency of the objects contained in the cluster to which the first object belongs and a second frequency indicating the frequency of the objects contained in the cluster to which the second object belongs, the occupancy calculation means calculates a first occupancy indicating the area that the first object occupies in the one image and a second occupancy indicating the area that the second object occupies in the one image, the object importance calculation means calculates a first importance, being the importance of the first object, based on the first frequency and the first occupancy, and a second importance, being the importance of the second object, based on the second frequency and the second occupancy, and the image importance calculation means calculates an importance indicating the evaluation of the one image based on the first importance and the second importance.
- 5. The image evaluation device according to claim 1, wherein the object is a human face cut out from the one image.
- 6. The image evaluation device according to claim 1, wherein, when a first object and a second object appear in the one image, the evaluation means calculates the importance of both objects while propagating importance between the two objects.
- 7. The image evaluation device according to claim 1, wherein the plurality of images containing objects number P (P being a natural number), the first value calculation means calculates Q (Q being a natural number), the number of clusters to which the objects contained in the P images belong, and the second value calculation means calculates, for each object appearing in the P images, a second value indicating a feature with which the object appears in each image, the device further comprising: node creation means for creating nodes including P nodes respectively representing the P images and Q nodes respectively representing the clusters to which the objects contained in the P images belong; link setting means for setting, for a node A representing an image A and a node a representing a cluster a to which an object contained in the image A belongs, the value of the link between node A and node a using the second value, in the image A, of the object belonging to the cluster a; and adjacency matrix generation means for generating an adjacency matrix expressing the graph constituted by the nodes created by the node creation means and the link values set by the link setting means, wherein the evaluation means includes: eigenvector calculation means for calculating the principal eigenvector of the generated adjacency matrix; and importance calculation means for calculating, based on the elements of the calculated principal eigenvector, an importance indicating the evaluation of each of the P images.
- 8. The image evaluation device according to claim 7, wherein the link setting means sets, of the link values between node A and node a, the value of the link from node A to node a using the second value, and sets the value of the link from node a to node A without using the second value.
- 9. The image evaluation device according to claim 8, wherein the second value calculated by the second value calculation means is an occupancy indicating the area that the object occupies in each image.
- 10. The image evaluation device according to claim 8, wherein the second value calculation means calculates the occupancy of the background, being the region of each image in which no object appears, the node creation means creates one node representing a dummy, and the link setting means sets, for a node A representing an image A and the one node Z representing the dummy, the value of the link from node A to node Z, of the link values between node A and node Z, using the occupancy of the background in the image A.
- 11. The image evaluation device according to claim 7, wherein the link setting means sets, as the link values between node A and node a, both the value of the link from node A to node a and the value of the link from node a to node A using the second value.
- 12. The image evaluation device according to claim 1, wherein a shooting date/time is associated with each of the plurality of images, the device further comprising reception means for receiving designation of a date/time range, wherein the first value calculation means and the second value calculation means perform their calculations using the images included in the designated range and do not perform them using the images not included in the designated range.
- 13. An image evaluation method for evaluating each of a plurality of images each containing an object, comprising: an acquisition step of acquiring, for each object, cluster information indicating the cluster to which the object belongs; a first value calculation step of calculating, for an object contained in one image, a first value based on the cluster to which the object belongs; a second value calculation step of calculating, for the object contained in the one image, a second value indicating a feature with which the object appears in the one image; and an evaluation step of calculating, based on the calculated first value and second value, the importance of one or more objects appearing in the one image and the importance of the one image.
- 14. A program for causing a computer to execute image evaluation processing for evaluating each of a plurality of images each containing an object, the image evaluation processing including: an acquisition step of acquiring, for each object, cluster information indicating the cluster to which the object belongs; a first value calculation step of calculating, for an object contained in one image, a first value based on the cluster to which the object belongs; a second value calculation step of calculating, for the object contained in the one image, a second value indicating a feature with which the object appears in the one image; and an evaluation step of calculating, based on the calculated first value and second value, the importance of one or more objects appearing in the one image and the importance of the one image.
- 15. An integrated circuit that evaluates each of a plurality of images containing objects, wherein cluster information indicating the cluster to which the object belongs is associated with each object, the integrated circuit comprising: first value calculation means for calculating, for an object contained in one image, a first value based on the cluster to which the object belongs; second value calculation means for calculating, for the object contained in the one image, a second value indicating a feature with which the object appears in the one image; and evaluation means for calculating, based on the calculated first value and second value, the importance of one or more objects appearing in the one image and the importance of the one image.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/255,324 US8660378B2 (en) | 2010-02-10 | 2010-12-16 | Image evaluating device for calculating an importance degree of an object and an image, and an image evaluating method, program, and integrated circuit for performing the same |
JP2011514937A JP4846071B2 (ja) | 2010-02-10 | 2010-12-16 | 画像評価装置、画像評価方法、プログラム、集積回路 |
CN201080014545.8A CN102439630B (zh) | 2010-02-10 | 2010-12-16 | 图像评价装置、图像评价方法、程序、集成电路 |
EP10845711.0A EP2535865A4 (en) | 2010-02-10 | 2010-12-16 | Image evaluating device, image evaluating method, program, and integrated circuit |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-027756 | 2010-02-10 | ||
JP2010027756 | 2010-02-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011099108A1 true WO2011099108A1 (ja) | 2011-08-18 |
Family
ID=44367419
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/007300 WO2011099108A1 (ja) | 2010-02-10 | 2010-12-16 | 画像評価装置、画像評価方法、プログラム、集積回路 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8660378B2 (ja) |
EP (1) | EP2535865A4 (ja) |
JP (2) | JP4846071B2 (ja) |
CN (1) | CN102439630B (ja) |
WO (1) | WO2011099108A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013110714A (ja) * | 2011-11-24 | 2013-06-06 | Canon Marketing Japan Inc | 撮影装置、その制御方法、及びプログラム |
JP2018170001A (ja) * | 2017-03-29 | 2018-11-01 | 西日本電信電話株式会社 | 映像データ処理装置、映像データ処理方法、及びコンピュータプログラム |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8606776B2 (en) * | 2011-02-18 | 2013-12-10 | Google Inc. | Affinity based ranked for search and display |
KR20120118383A (ko) * | 2011-04-18 | 2012-10-26 | 삼성전자주식회사 | 이미지 보정 장치 및 이를 이용하는 이미지 처리 장치와 그 방법들 |
US8832080B2 (en) | 2011-05-25 | 2014-09-09 | Hewlett-Packard Development Company, L.P. | System and method for determining dynamic relations from images |
CN103620648B (zh) * | 2012-05-29 | 2018-02-02 | 松下知识产权经营株式会社 | 图像评价装置、图像评价方法、集成电路 |
US8861804B1 (en) | 2012-06-15 | 2014-10-14 | Shutterfly, Inc. | Assisted photo-tagging with facial recognition models |
WO2014091667A1 (ja) * | 2012-12-10 | 2014-06-19 | 日本電気株式会社 | 解析制御システム |
JP2014139734A (ja) * | 2013-01-21 | 2014-07-31 | Sony Corp | 情報処理装置および方法、並びにプログラム |
US9767347B2 (en) * | 2013-02-05 | 2017-09-19 | Nec Corporation | Analysis processing system |
JP5802255B2 (ja) * | 2013-03-13 | 2015-10-28 | 富士フイルム株式会社 | レイアウト編集装置、レイアウト編集方法およびプログラム |
JP6468725B2 (ja) * | 2013-08-05 | 2019-02-13 | キヤノン株式会社 | 画像処理装置、画像処理方法、及びコンピュータプログラム |
JP6334697B2 (ja) * | 2013-11-08 | 2018-05-30 | グーグル エルエルシー | ディスプレイコンテンツのイメージを抽出し、生成するシステムおよび方法 |
US10013639B1 (en) * | 2013-12-16 | 2018-07-03 | Amazon Technologies, Inc. | Analyzing digital images based on criteria |
US9177194B2 (en) * | 2014-01-29 | 2015-11-03 | Sony Corporation | System and method for visually distinguishing faces in a digital image |
JP6367168B2 (ja) * | 2015-09-18 | 2018-08-01 | 富士フイルム株式会社 | 画像処理装置、画像処理方法、プログラムおよび記録媒体 |
JP6440604B2 (ja) | 2015-09-29 | 2018-12-19 | 富士フイルム株式会社 | 被写体評価システム,被写体評価方法,被写体評価プログラムおよびそのプログラムを格納した記録媒体 |
JP6742186B2 (ja) * | 2016-07-28 | 2020-08-19 | ヤフー株式会社 | 決定装置、決定方法、及び決定プログラム |
JP7000200B2 (ja) * | 2018-02-22 | 2022-01-19 | 株式会社野村総合研究所 | 広告効果予測システム、方法およびプログラム |
US11144998B2 (en) * | 2018-09-20 | 2021-10-12 | The Toronto-Dominion Bank | Dynamic provisioning of data exchanges based on detected relationships within processed image data |
KR102179591B1 (ko) * | 2018-12-12 | 2020-11-17 | 인하대학교 산학협력단 | 동영상 내 인물 영역 추출 장치 |
KR102179590B1 (ko) * | 2018-12-12 | 2020-11-17 | 인하대학교 산학협력단 | 동영상 내 등장인물 갈등정보 추출 장치 |
CN112527858B (zh) * | 2020-11-26 | 2024-06-25 | 微梦创科网络科技(中国)有限公司 | 基于社交内容的营销账号识别方法、装置、介质和设备 |
US11822596B2 (en) * | 2021-04-30 | 2023-11-21 | Hewlett-Packard Development Company, L.P. | Digital image file metadata characteristics |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004361989A (ja) | 2003-05-30 | 2004-12-24 | Seiko Epson Corp | 画像選択システム及び画像選択プログラム、並びに画像選択方法 |
JP2006172417A (ja) | 2004-11-16 | 2006-06-29 | Seiko Epson Corp | 画像評価方法、画像評価装置、及び印刷装置 |
JP2007049387A (ja) * | 2005-08-09 | 2007-02-22 | Canon Inc | 画像出力装置及び画像出力方法 |
JP2007080014A (ja) | 2005-09-15 | 2007-03-29 | Fujifilm Corp | 画像評価装置および方法並びにプログラム |
JP2007135065A (ja) | 2005-11-11 | 2007-05-31 | Konica Minolta Photo Imaging Inc | 表示プログラム及び画像データの表示方法 |
JP2008084047A (ja) * | 2006-09-28 | 2008-04-10 | Fujifilm Corp | 画像評価装置および方法並びにプログラム |
JP2009104003A (ja) * | 2007-10-24 | 2009-05-14 | Internatl Business Mach Corp <Ibm> | 教材選択システムの方法とプログラム |
JP2009533725A (ja) * | 2006-04-07 | 2009-09-17 | イーストマン コダック カンパニー | 画像コレクション間の接続の形成 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6748097B1 (en) * | 2000-06-23 | 2004-06-08 | Eastman Kodak Company | Method for varying the number, size, and magnification of photographic prints based on image emphasis and appeal |
JP4197858B2 (ja) * | 2001-08-27 | 2008-12-17 | 富士通株式会社 | 画像処理プログラム |
US7293007B2 (en) | 2004-04-29 | 2007-11-06 | Microsoft Corporation | Method and system for identifying image relatedness using link and page layout analysis |
US8031914B2 (en) * | 2006-10-11 | 2011-10-04 | Hewlett-Packard Development Company, L.P. | Face-based image clustering |
FR2923307B1 (fr) * | 2007-11-02 | 2012-11-16 | Eastman Kodak Co | Procede d'organisation de donnees multimedia |
JP2009217724A (ja) * | 2008-03-12 | 2009-09-24 | Panasonic Corp | 関連文書推定装置、関連文書推定方法及びプログラム、並びに記録媒体 |
US9015597B2 (en) * | 2009-07-31 | 2015-04-21 | At&T Intellectual Property I, L.P. | Generation and implementation of a social utility grid |
US8208696B2 (en) * | 2009-12-15 | 2012-06-26 | Hewlett-Packard Development Company, L.P. | Relation tree |
US20120050789A1 (en) * | 2010-08-31 | 2012-03-01 | Apple Inc. | Dynamically Generated Digital Photo Collections |
-
2010
- 2010-12-16 EP EP10845711.0A patent/EP2535865A4/en not_active Withdrawn
- 2010-12-16 JP JP2011514937A patent/JP4846071B2/ja active Active
- 2010-12-16 US US13/255,324 patent/US8660378B2/en active Active
- 2010-12-16 WO PCT/JP2010/007300 patent/WO2011099108A1/ja active Application Filing
- 2010-12-16 CN CN201080014545.8A patent/CN102439630B/zh active Active
-
2011
- 2011-10-07 JP JP2011222806A patent/JP5307873B2/ja active Active
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004361989A (ja) | 2003-05-30 | 2004-12-24 | Seiko Epson Corp | 画像選択システム及び画像選択プログラム、並びに画像選択方法 |
JP2006172417A (ja) | 2004-11-16 | 2006-06-29 | Seiko Epson Corp | 画像評価方法、画像評価装置、及び印刷装置 |
JP2007049387A (ja) * | 2005-08-09 | 2007-02-22 | Canon Inc | 画像出力装置及び画像出力方法 |
JP2007080014A (ja) | 2005-09-15 | 2007-03-29 | Fujifilm Corp | 画像評価装置および方法並びにプログラム |
JP2007135065A (ja) | 2005-11-11 | 2007-05-31 | Konica Minolta Photo Imaging Inc | 表示プログラム及び画像データの表示方法 |
JP2009533725A (ja) * | 2006-04-07 | 2009-09-17 | イーストマン コダック カンパニー | 画像コレクション間の接続の形成 |
JP2008084047A (ja) * | 2006-09-28 | 2008-04-10 | Fujifilm Corp | 画像評価装置および方法並びにプログラム |
JP2009104003A (ja) * | 2007-10-24 | 2009-05-14 | Internatl Business Mach Corp <Ibm> | 教材選択システムの方法とプログラム |
Non-Patent Citations (3)
Title |
---|
KAZUHIRO HOTTA, JSPS RESEARCH FELLOW |
See also references of EP2535865A4 |
TECHNICAL COMMITTEE ON HUMAN INFORMATION PROCESSING, vol. 100, no. 34, pages 31 - 38 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013110714A (ja) * | 2011-11-24 | 2013-06-06 | Canon Marketing Japan Inc | 撮影装置、その制御方法、及びプログラム |
JP2018170001A (ja) * | 2017-03-29 | 2018-11-01 | 西日本電信電話株式会社 | 映像データ処理装置、映像データ処理方法、及びコンピュータプログラム |
Also Published As
Publication number | Publication date |
---|---|
EP2535865A1 (en) | 2012-12-19 |
JP4846071B2 (ja) | 2011-12-28 |
JP2012053889A (ja) | 2012-03-15 |
CN102439630A (zh) | 2012-05-02 |
US20110317928A1 (en) | 2011-12-29 |
JPWO2011099108A1 (ja) | 2013-06-13 |
US8660378B2 (en) | 2014-02-25 |
JP5307873B2 (ja) | 2013-10-02 |
CN102439630B (zh) | 2015-05-20 |
EP2535865A4 (en) | 2017-12-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5307873B2 (ja) | 画像評価装置、画像評価方法、プログラム、集積回路 | |
JP5727476B2 (ja) | 画像評価装置、画像評価方法、プログラム、集積回路 | |
US20220116347A1 (en) | Location resolution of social media posts | |
US11019017B2 (en) | Social media influence of geographic locations | |
CN109740152B (zh) | 文本类目的确定方法、装置、存储介质和计算机设备 | |
JP6323465B2 (ja) | アルバム作成プログラム、アルバム作成方法およびアルバム作成装置 | |
JP6041217B2 (ja) | 画像評価装置、画像評価方法、プログラム、集積回路 | |
CN106326391A (zh) | 多媒体资源推荐方法及装置 | |
JP2014093058A (ja) | 画像管理装置、画像管理方法、プログラム及び集積回路 | |
JPWO2012169112A1 (ja) | コンテンツ処理装置、コンテンツ処理方法、プログラム、及び集積回路 | |
Yamasaki et al. | Social popularity score: Predicting numbers of views, comments, and favorites of social photos using only annotations | |
JP2014092955A (ja) | 類似コンテンツ検索処理装置、類似コンテンツ検索処理方法、およびプログラム | |
Vonikakis et al. | A probabilistic approach to people-centric photo selection and sequencing | |
JP2008234338A (ja) | 旬度解析システム、旬度解析方法、及び旬度解析プログラム | |
CN114997135A (zh) | 差异文本筛选方法、装置、设备及存储介质 | |
US20110044530A1 (en) | Image classification using range information | |
CN117493672A (zh) | 产品的推荐方法、装置、存储介质及电子设备 | |
CN107506735A (zh) | 照片归类方法以及归类系统 | |
JP2012058926A (ja) | キーワード付与装置及びプログラム | |
JP6276375B2 (ja) | 画像検索装置、画像表示装置、画像検索方法、および画像表示方法 | |
KR102214136B1 (ko) | Sns 기반 상품 이미지 검색 방법 | |
CN116662664A (zh) | 关键特征确定方法、装置、计算机设备、介质和程序产品 | |
JP2015187807A (ja) | 重要度算出装置、重要度算出装置方法、及び重要度算出装置システム | |
CN117076780A (zh) | 一种基于大数据的图书互借阅推荐方法及系统 | |
CN115545736A (zh) | 一种用户消费行为识别方法、装置、电子设备及存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080014545.8 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011514937 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13255324 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010845711 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10845711 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |