WO2012111275A1 - Image evaluation device, image evaluation method, program, and integrated circuit - Google Patents
- Publication number
- WO2012111275A1 (PCT/JP2012/000787)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- evaluation
- images
- importance
- background
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
Definitions
- the present invention relates to a technique for creating an album by arranging appropriate images.
- Patent Document 1 describes a technique for supporting album creation.
- a target value is set in one image insertion area of an album, and an image close to the target value is displayed.
- the album will be easier to view if the images can be combined with each other rather than being arranged in isolation.
- in Patent Document 1, the idea is to insert a single image similar to the target value into a single image insertion area, and no configuration in which combinations of images are considered is disclosed.
- the present invention has been made against this background, and its purpose is to provide an image evaluation apparatus capable of evaluating the selection of images suitable for an album, and the arrangement of the selected images, based on combinations between images.
- an image evaluation apparatus comprises: template storage means for storing a template that has N frames (N being a natural number of 2 or more) for arranging images, one or more frame sets each combining a plurality of those N frames, and evaluation items relating to the feature amounts of the images arranged in each frame set; acquisition means for acquiring a plurality of images; evaluation means for generating a plurality of arrangement patterns in which N images selected from the acquired images are arranged in the N frames, and for calculating an evaluation value for each generated arrangement pattern based on the evaluation items;
- and evaluation value storage means for storing the evaluation value of each arrangement pattern calculated by the evaluation means, wherein the evaluation means identifies the highest evaluation value among the evaluation values stored in the evaluation value storage means and the N images arranged in the corresponding arrangement pattern.
- the image evaluation apparatus calculates an evaluation value based on an evaluation item obtained by combining the feature amounts of the images arranged in the frame set.
- the selection of images suitable for the album and the arrangement of the selected images can be evaluated based on the combination between images.
- Block diagram of an image evaluation system
- Overall flowchart
- Flowchart showing details of image acquisition and object, background, and event extraction
- Image diagram showing a specific example of the object clustering operation
- Diagram showing the data structure of object cluster information
- Image diagram showing a specific example of the background feature extraction operation
- Image diagram showing a specific example of the object behavior calculation operation
- Flowchart showing details of the scene transition calculation
- Image diagram showing a specific example of the scene transition degree calculation
- Flowchart showing details of the overall scene image calculation
- Image diagram showing a specific example of the operation for calculating the overall scene image
- Diagram showing data
- FIG. 1 is a block diagram showing the basic configuration of the image evaluation apparatus in the first embodiment.
- the image evaluation system 1 includes a storage medium 2, an image evaluation apparatus 3, and a display 4.
- the image evaluation apparatus 3 includes an image acquisition unit 10; an object extraction unit 20 (including an object feature amount extraction unit 21 and an object clustering unit 22); an event extraction unit 30 (including a shooting date/time information extraction unit 31 and an event clustering unit 32);
- a background extraction unit 40 including a background feature amount extraction unit 41;
- an image evaluation unit 50 including an object importance calculation unit 51 and a background importance calculation unit 52;
- a storage unit 90 including a template information storage unit 91, an object cluster information storage unit 92, and the other storage units described below;
- an album information selection unit 60 including an event cluster selection unit 61 and a template selection unit 62;
- an image set evaluation unit 70 including an object introduction degree calculation unit 71, a scene transition degree calculation unit 72, an object behavior degree calculation unit 73, and an overall scene image degree calculation unit 74;
- an arrangement pattern evaluation unit 80; and a display control unit 100.
- the image acquisition unit 10 acquires image data from the storage medium 2.
- the image acquisition unit 10 includes, for example, an SD card reader, and acquires image data from the storage medium 2 as an SD memory card inserted into the SD card slot.
- the object extraction unit 20 extracts objects from the acquired image data, and performs clustering of the extracted objects.
- the object feature amount extraction unit 21 cuts out an area where an object appears from the image for the image data acquired by the image acquisition unit 10 and extracts the feature amount of the object.
- the object clustering unit 22 performs clustering based on the extracted feature quantity, and stores information indicating the result in the object cluster information storage unit 92.
- the object is a human face.
- as a clustering method for the object feature values, the k-means method (see Reference 1) can be used; it is a non-hierarchical method (a method that performs clustering by assigning a representative to each of a fixed number of clusters).
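The k-means step above can be sketched as follows. This is a minimal illustration, not the patented implementation: the face feature vectors, the choice of k = 2, and the initialization with the first k points are all assumptions for the example.

```python
# Toy k-means over face feature vectors (cf. Reference 1).
# Feature values and k are hypothetical, not taken from the patent.

def kmeans(points, k, iters=20):
    # Initialize representatives with the first k points.
    centers = [list(p) for p in points[:k]]
    assign = [0] * len(points)
    for _ in range(iters):
        # Assignment step: each face goes to its nearest representative.
        for i, p in enumerate(points):
            assign[i] = min(range(k),
                            key=lambda c: sum((a - b) ** 2
                                              for a, b in zip(p, centers[c])))
        # Update step: move each representative to its cluster mean.
        for c in range(k):
            members = [points[i] for i in range(len(points)) if assign[i] == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign

# Four face feature vectors: F1 and F4 resemble each other, as do F2 and F3.
faces = [(0.1, 0.9), (0.8, 0.2), (0.82, 0.18), (0.12, 0.88)]
labels = kmeans(faces, k=2)
```

With these toy features, F1/F4 and F2/F3 fall into the two clusters, mirroring the person a / person b grouping described later for FIG. 4.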
- the background extraction unit 40 extracts, for the image data acquired by the image acquisition unit 10, the background feature amount of the background (the region of the image excluding the region cut out by the object feature amount extraction unit 21) and stores the extracted information in the background feature amount information storage unit 94.
- as a method of extracting the background feature amount, the image is divided into regions, the most frequent color feature amount in each divided region is extracted as its representative color, and a histogram of the representative colors is used as the background feature amount.
- the event extraction unit 30 classifies the image data acquired by the image acquisition unit 10 based on the event.
- the shooting date / time information extraction unit 31 extracts the shooting date / time information added to the image for the image data acquired by the image acquisition unit 10.
- the event clustering unit 32 clusters images into a plurality of events based on the extracted shooting date / time information. Then, information indicating the clustered result is stored in the event cluster information storage unit 96.
- as an event clustering method, a method can be used in which an arbitrary time is set as a threshold value, and a point where the difference between the shooting dates and times of two images exceeds the threshold value is treated as an event delimiter (see Reference 2).
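The threshold-based delimiting above can be sketched as follows. The 4-hour threshold and the timestamps are illustrative assumptions; the patent leaves the threshold arbitrary.

```python
# Sort images by shooting date/time and start a new event wherever the
# gap between consecutive shots exceeds a chosen threshold (cf. Reference 2).
from datetime import datetime, timedelta

def cluster_events(shots, threshold=timedelta(hours=4)):
    # shots: list of (image_id, datetime) pairs
    shots = sorted(shots, key=lambda s: s[1])
    events, current = [], [shots[0]]
    for prev, cur in zip(shots, shots[1:]):
        if cur[1] - prev[1] > threshold:
            events.append(current)   # gap exceeds threshold: event delimiter
            current = []
        current.append(cur)
    events.append(current)
    return [[img for img, _ in ev] for ev in events]

shots = [("A", datetime(2012, 2, 14, 10, 0)),
         ("B", datetime(2012, 2, 14, 10, 30)),
         ("C", datetime(2012, 2, 15, 9, 0)),
         ("D", datetime(2012, 2, 15, 9, 45))]
events = cluster_events(shots)
```

With these timestamps the overnight gap splits the four shots into two events, matching the A/B vs C/D grouping of the FIG. 8 example.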
- the image evaluation unit 50 includes an object importance level calculation unit 51 and a background importance level calculation unit 52.
- the object importance calculation unit 51 calculates an object importance indicating the importance of each object, such as a person, shown in the images, and stores the calculated information in the object importance storage unit 93.
- a cluster whose objects appear more frequently in the images is given a higher object importance.
- for example, suppose the number of images acquired by the image acquisition unit 10 is 100, of which 30 images include person a, 20 images include person b, and 10 images include person c.
- in this case, the object importance calculation unit 51 calculates the importance of person a as 30, that of person b as 20, and that of person c as 10, according to the number of images in which each appears (see FIG. 12).
- the size (occupancy) of a person appearing in each image may be calculated so that the importance increases as the occupancy increases.
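The appearance-frequency calculation can be sketched as follows, using the 100-image example; the per-image person tags are invented to reproduce the 30/20/10 counts, and the occupancy-weighted variant mentioned above is omitted.

```python
# Object importance as appearance frequency: count, for each person
# cluster, the number of images in which the person appears.
from collections import Counter

def object_importance(images):
    # images: list of sets of person-cluster names, one set per image
    counts = Counter()
    for persons in images:
        counts.update(persons)
    return dict(counts)

# 100 acquired images: person a in 30, person b in 20, person c in 10.
images = [{"a"}] * 30 + [{"b"}] * 20 + [{"c"}] * 10 + [set()] * 40
importance = object_importance(images)
```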
- the background importance calculation unit 52 includes object cluster information stored in the object cluster information storage unit 92, event cluster information stored in the event cluster information storage unit 96, and background feature amount stored in the background feature amount information storage unit 94. Based on the above, the background importance of each image belonging to a certain event is calculated, and the calculated background importance is stored in the background importance storage unit 95. Details of this background importance calculation method will be described later with reference to FIG.
- the storage unit 90 includes a template information storage unit 91, an object cluster information storage unit 92, an object importance storage unit 93, a background feature amount information storage unit 94, a background importance storage unit 95, an event cluster information storage unit 96, an image set information storage unit 97, and an arrangement pattern storage unit 98.
- the storage unit can be constituted by a RAM, for example.
- the album information selection unit 60 has a function of performing processing related to selection of information (events and templates) related to albums.
- the event cluster selection unit 61 has a function of selecting an event to which an image used for creating an album belongs.
- the event cluster selection unit 61 selects one of the event clusters (for example, one of three event types). This selection may be performed by displaying an event selection menu on the display 4 and receiving the user's input from an input device.
- the template selection unit 62 selects one type of template from a plurality of types of templates stored in the template information storage unit 91.
- the template information storage unit 91 stores, for each template, (a) information on a design indicating the frame layout and the like, and (b) information on evaluation items relating to a pair of images arranged in a pair of frames.
- the arrangement pattern evaluation unit 80 calculates an evaluation value for each arrangement pattern (composite image set) when images belonging to the event cluster selected by the event cluster selection unit 61 are arranged with respect to the template selected by the template selection unit 62. Then, the calculated evaluation value is stored in the arrangement pattern information storage unit 98. Then, the arrangement pattern evaluation unit 80 creates an album corresponding to the arrangement pattern having the highest evaluation value.
- the arrangement pattern evaluation unit 80 calculates an evaluation value for each of the 20160 arrangement patterns by brute force.
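The brute-force enumeration can be sketched as follows. One way the figure 20160 arises is as the number of ordered placements of 6 of 8 images (8P6 = 20160); the 8-image/6-frame split and the scoring function are assumptions for illustration, since the real device scores patterns with the evaluation items described in the text.

```python
# Enumerate every arrangement pattern (ordered selection of images into
# frames) and keep the highest-scoring one.
from itertools import permutations

images = list("GHIJKLMN")   # 8 candidate images (hypothetical)
frames = 6                  # N frames in the template (hypothetical)

def score(pattern):
    # Placeholder evaluation; the actual apparatus combines frame-set
    # evaluation items such as introduction and transition degrees.
    return sum(ord(img) for img in pattern)

patterns = list(permutations(images, frames))   # 8P6 = 20160 patterns
best = max(patterns, key=score)
```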
- the image set evaluation unit 70 is in charge of calculating the details of the evaluation values.
- evaluation item 1, the "object introduction degree", indicates how well the objects in the pair of images serve to introduce the event's participants.
- determining factors of evaluation item 1: the evaluation is higher as more kinds of people appear in the pair of images, and as the people shown in the pair are more important.
- evaluation item 2, the "object behavior degree", indicates the degree of an object's behavior within a scene.
- both the object behavior degree of evaluation item 2 and the overall scene image degree of evaluation item 4 focus on the similarity of backgrounds; they can be said to measure the degree of change relating to the same object cluster between images having similar backgrounds.
- the object introduction degree calculation unit 71, the scene transition degree calculation unit 72, the object behavior degree calculation unit 73, and the overall scene image degree calculation unit 74 respectively calculate the object introduction degree, the scene transition degree, the object behavior degree, and the overall scene image degree.
- the display control unit 100 has a function of causing the display 4 to perform various displays.
- based on the evaluation values of the arrangement patterns stored in the arrangement pattern information storage unit 98 and the template information stored in the template information storage unit 91, the display control unit 100 displays on the screen of the display 4 the album in which images are applied to the template serving as the album mount according to an arrangement pattern (for example, the arrangement pattern having the highest evaluation value).
- <Operation>
- the flow until the image arrangement pattern is evaluated will be described.
- FIG. 2 is a flowchart showing the overall processing flow.
- the image evaluation apparatus 3 performs image acquisition and object/background/event extraction (S21) and image evaluation (S22), then acquires event cluster information from the event cluster information storage unit 96.
- from the acquired event clusters, the user selects the event for which he or she wants to make an album (S23).
- the template information is acquired from the template information storage unit 91, and the user selects a template to be used as the album mount (S24).
- two images (a pair of images) are set from the image group belonging to the selected event cluster (S25), image set evaluation (S26) is performed on the set pair, and this is repeated until all pairs of images belonging to the event cluster have been evaluated.
- the image acquisition / object / background / event extraction process in step S21 will be described with reference to FIG.
- the image acquisition unit 10 acquires the image data stored in the storage medium 2 from the storage medium 2 (S31).
- assume that image data for three images A to C, showing persons a and b as in FIG. 4, is stored in the storage medium 2 and is acquired.
- the object extraction unit 20 cuts out face regions from the images in the acquired image data, and extracts face feature amounts such as the outline of the face and the ratio (occupancy) of the face to the image (S32).
- the object clustering unit 22 performs clustering using the extracted face feature amount, and stores the object cluster information as a result of clustering in the object cluster information storage unit 92 (S33).
- the background feature amount extraction unit 41 of the background extraction unit 40 extracts the background feature amount for the acquired image data and stores it in the background feature amount information storage unit 94 (S34).
- the background feature amount is information regarding the color feature amount of the background region of the image (the region excluding the region cut out as an object in the image). Details thereof will be described later with reference to FIG.
- the shooting date / time information extraction unit 31 of the event extraction unit 30 extracts the shooting date / time information from the Exif (Exchangeable image file format) information attached to the image shot by the digital camera or the like for the acquired image data. (S35). Then, the event clustering unit 32 performs clustering using the extracted shooting date / time information, and stores the event cluster information, which is the clustered result, in the event cluster information storage unit 96 (S36).
- the object feature amount extraction unit 21 cuts out four faces F1 to F4 from the three images A to C (FIG. 4A) and extracts the feature values of the faces F1 to F4 (FIG. 4B). The object clustering unit 22 then clusters the similar faces F2 and F3 among the four as person a, and the faces F1 and F4 as person b (FIG. 4C).
- FIG. 5 is a diagram showing the data structure of object cluster information indicating the result of object clustering (S33).
- the object cluster information is information indicating which person cluster each face appearing in the images belongs to, and includes the items "object cluster name", "face", "image", and "face occupancy".
- the background feature quantity extraction unit 41 divides each of the three images A to C (FIG. 6(a)) into 32 regions (FIG. 6(b)), extracts the most frequent color in each background segment (the segments other than the face) as the representative color of that segment, and extracts a color histogram of each image based on the extracted representative colors as the background feature amount (FIG. 6(c)).
- each background feature amount is multiplied by (32 / number of background areas) (normalization). Specifically, 32/18, 32/25, and 32/22 are multiplied by the background feature amounts of the images A, B, and C, respectively.
- histograms over five representative colors (black, white, red, green, and blue), divided into the upper and lower parts of each image, are calculated as the background feature amounts of images A, B, and C, respectively.
- the background feature quantity extraction unit 41 stores each normalized background feature quantity in the background feature quantity information storage unit 94 (see FIG. 7).
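The segment-based extraction and the 32/(number of background areas) normalization above can be sketched as follows. The per-segment representative colors and the face mask are invented toy data; upper/lower splitting is omitted for brevity.

```python
# Background feature: a 5-color histogram over the representative colors
# of the non-face segments, scaled by 32 / (number of background segments).
COLORS = ("black", "white", "red", "green", "blue")

def background_histogram(segment_colors, face_segments):
    # Keep only segments not covered by a face.
    background = [c for i, c in enumerate(segment_colors)
                  if i not in face_segments]
    hist = {c: 0 for c in COLORS}
    for c in background:
        hist[c] += 1
    scale = 32 / len(background)   # normalization from the text
    return {c: n * scale for c, n in hist.items()}

# 32 segments; segments 0-13 are covered by faces, leaving 18 background
# segments (matching the 32/18 factor quoted for image A).
segments = ["blue"] * 20 + ["white"] * 12
hist = background_histogram(segments, face_segments=set(range(14)))
```

After normalization every image's histogram sums to 32 regardless of how many segments the faces covered, which is what makes histograms comparable across images.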
- the shooting date/time extraction unit 31 extracts the shooting date and time from each of the four images A to D (FIG. 8(a), FIG. 8(b)).
- the shooting date and time can be extracted from Exif information added to an image shot with a digital camera or the like.
- the event clustering unit 32 clusters the two images with the closest shooting dates and times, images A and B, as event f, and images C and D as event g (FIG. 8(c)).
- FIG. 9 is a diagram showing a data structure of event cluster information indicating a result of event clustering (S36).
- the event cluster information is information indicating which event cluster the image belongs to, and includes items of “event cluster name”, “image”, and “shooting date / time”.
- the image evaluation unit 50 performs object importance calculation for the objects shown in the images (S101), and then calculates the importance of the background of each image (S102).
- the process of calculating the object importance (S101) will be described with reference to FIG.
- the object importance degree calculation unit 51 acquires object cluster information from the object cluster information storage unit 92 (S111).
- the object importance calculation unit 51 calculates the object importance for each object cluster based on the acquired object cluster information (S112).
- specifically, for each object cluster, the number of images containing an object belonging to that cluster is counted, and the appearance frequency is obtained from this count. The object importance is then calculated so as to be proportional to the appearance frequency.
- FIG. 12 is a diagram showing a data structure of object importance indicating the result of object importance calculation.
- the object importance degree stored in the object importance degree storage unit 93 is information indicating the importance degree of the object shown in the image, and includes items of “object cluster name”, “object importance degree”, and “rank”.
- the object importance levels of the person a, the person b, and the person c appearing in the image are set to 30, 20, and 10, and rankings based on the importance levels are given.
- the background importance calculation unit 52 acquires event cluster information from the event cluster information storage unit 96 and background feature amounts from the background feature amount information storage unit 94 (S131).
- the background importance calculation unit 52 specifies an image group belonging to the event cluster selected by the user in step S23 (see FIG. 2) based on the acquired event cluster information and the background feature amount. Then, the similarity of the background feature amount between any two images is calculated from the specified image group (S132).
- the background importance calculation unit 52 calculates, for each image in the image group belonging to the selected event cluster, the sum of the similarities of the background feature amounts between that image and the other images,
- and thereby obtains the background frequency of the image (S133).
- for example, the background frequency of image C is the total of the similarities of the background feature amounts between image C and each of the other images D to F:
- background frequency of image C = similarity of the background feature quantity between images C-D + similarity between images C-E + similarity between images C-F.
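The summation above can be sketched as follows. The similarity table is illustrative toy data (the C-D value in particular is an assumption, since the text only quotes the values involving E and F).

```python
# Background frequency of S133: the sum of an image's background-feature
# similarities to every other image in the event.

def background_frequency(image, others, sim):
    return sum(sim[frozenset((image, o))] for o in others)

# Symmetric toy similarities; pairs are unordered, hence frozenset keys.
sim = {frozenset(p): s for p, s in {
    ("C", "D"): 0.9, ("C", "E"): 0.8, ("D", "E"): 0.7,
    ("C", "F"): 0.1, ("D", "F"): 0.1, ("E", "F"): 0.1}.items()}

freq_E = background_frequency("E", ["C", "D", "F"], sim)
```

With the quoted similarities 0.8, 0.7, and 0.1, image E's background frequency comes out to 1.6, the value given for FIG. 14B.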
- the background importance calculation unit 52 acquires object cluster information from the object cluster information storage unit 92 and object importance from the object importance information storage unit 93 (S134).
- based on the acquired object cluster information, the object importance information, and the similarity of the background feature amounts between images calculated in S132, the background importance calculation unit 52
- calculates an object co-occurrence degree indicating the degree to which important persons appear together (co-occur) in images whose backgrounds are similar to the background of the image in question (S135).
- the background importance calculation unit 52 calculates the background importance based on the calculated background frequency and object co-occurrence of the image, and stores the background importance in the background importance information storage unit 95 (S136).
- images C to E in FIG. 14A are images taken on the beach, whereas only image F is an image taken on the road.
- the background similarity is expressed in the range of 0 to 1, with 0 being most dissimilar and 1 most similar;
- the similarity between images C-E is 0.8,
- and the similarity between images D-E is 0.7.
- the background similarity of any pair including image F is low: it is 0.1 between images C-F, between images D-F, and between images E-F.
- FIG. 14B shows an example of calculating the background frequency; the background frequency of image E (1.6) is the sum of the similarities of the background feature values between images C-E (0.8), images D-E (0.7), and images E-F (0.1).
- FIG. 14C shows an example of calculating the person co-occurrence degree (object co-occurrence degree).
- the person co-occurrence degree is calculated by multiplying the importance of the persons shown in images C and D by the calculated similarity of the background feature amounts, and summing the products.
- the background importance is then calculated by multiplying the calculated background frequency by the person co-occurrence degree (FIG. 14(d)).
- in this way, the background importances of images E and F are calculated from images C to F.
- the background importance of image E, whose background is similar to the images in which the persons appear, is calculated to be high, while the background importance of image F is calculated to be low.
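The E-versus-F comparison above can be sketched as follows. The exact weighting is a reading of the text, not a confirmed formula, and all numeric inputs (importances, similarities) are illustrative.

```python
# S135/S136 sketch: co-occurrence sums (importance of persons shown) x
# (background similarity) over the other images; background importance
# is background frequency x co-occurrence.

def co_occurrence(image, others, persons, importance, sim):
    total = 0.0
    for o in others:
        weight = sum(importance[p] for p in persons.get(o, ()))
        total += weight * sim[frozenset((image, o))]
    return total

def background_importance(image, others, persons, importance, sim):
    freq = sum(sim[frozenset((image, o))] for o in others)
    return freq * co_occurrence(image, others, persons, importance, sim)

importance = {"a": 30, "b": 20}
persons = {"C": ["a"], "D": ["b"]}          # E and F show no person
sim = {frozenset(p): s for p, s in {
    ("E", "C"): 0.8, ("E", "D"): 0.7, ("E", "F"): 0.1,
    ("F", "C"): 0.1, ("F", "D"): 0.1}.items()}

imp_E = background_importance("E", ["C", "D", "F"], persons, importance, sim)
imp_F = background_importance("F", ["C", "D"], persons, importance, sim)
```

Image E, whose background resembles the person-bearing images C and D, ends up far more important than the road image F, matching the qualitative outcome described for FIG. 14(d).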
- FIG. 15 is a diagram showing a data structure of background importance indicating the result of background importance calculation (S136).
- the background importance is information indicating the importance of the background (the area other than the objects shown in the image, which indicates the scene of the event), and includes the items "image ID", "background importance", and "rank".
- the background importance levels 0.6 and 0.4 of the images A and B, and the ranks based on the background importance levels are stored.
- the image set evaluation unit 70 performs the following four types of evaluation based on the two images (image sets) set through steps S23 to S25 in FIG.
- (1) an object introduction degree for image sets that introduce the participants of the event;
- (2) an object behavior degree for image sets that show the actions performed by a specific participant at the event;
- (3) a scene transition degree for image sets that show the transitions between scenes of the event;
- (4) an overall scene image degree for image sets that convey an understandable overall picture of a scene of the event.
- the process (S161) for calculating the object introduction degree will be described with reference to FIG.
- the object introduction degree calculation unit 71 acquires object cluster information from the object cluster information storage unit 92 and object importances from the object importance storage unit 93 (S171).
- the object introduction degree calculation unit 71 then determines whether an object is shown in the two images;
- if so, it calculates the object introduction degree by summing the importance of each object appearing in the two images, and stores the calculated object introduction degree in the image set information storage unit 97 (S173).
- Step 173 will be specifically described with reference to FIG. 18.
- for example, for the image set GH (the combination of images G and H) from the eight images G to N (FIG. 18A), the object introduction degree calculation unit 71 obtains the sum of the object importances of persons b and c shown in image G and of person a shown in image H, and calculates this sum as the object introduction degree of the image set (FIG. 18B).
- an image set GH and an image set JK are selected from the eight images (images G to I on the beach, images J to M on the sea, and image N on the road), and the object introduction degree of each combination is calculated.
- the object introduction degree of the image set GH including all the family members is calculated high, and the object introduction degree of the image set JK including no family members is calculated low.
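The summation above can be sketched as follows; the importances reuse the 30/20/10 example, and the per-image person lists are assumptions chosen to reproduce the GH-versus-JK contrast.

```python
# Object introduction degree of S173: the total importance of every
# person appearing in either image of the pair.

def introduction_degree(pair, persons, importance):
    shown = set()
    for img in pair:
        shown |= set(persons.get(img, ()))
    return sum(importance[p] for p in shown)

importance = {"a": 30, "b": 20, "c": 10}
persons = {"G": ["b", "c"], "H": ["a"], "J": [], "K": []}  # toy tags

intro_GH = introduction_degree(("G", "H"), persons, importance)
intro_JK = introduction_degree(("J", "K"), persons, importance)
```

The pair GH, which covers all three family members, scores 30 + 20 + 10 = 60, while the person-free pair JK scores 0, matching the high/low outcome described in the text.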
- the object behavior degree calculation unit 73 acquires object cluster information from the object cluster information storage unit 92, object importances from the object importance storage unit 93, background feature amounts from the background feature amount information storage unit 94, background importances from the background importance storage unit 95, and shooting date/time information from the event cluster information storage unit 96 (S191).
- if the same object is shown in both images (S192: Yes), the object behavior degree calculation unit 73 calculates the similarity of the background feature amounts between the images (S193), and then the shooting date/time interval between the images (S194). Based on the calculated background feature similarity, the shooting date/time interval, the object importances of both images, and the background importances of both images, the object behavior degree is calculated,
- and the calculated object behavior degree is stored in the image set information storage unit 97 (S195). This reflects the tendency of users to look back on what the family members who participated in an event were doing: the more an image set shows the same important person in the same scene, the more important the image set is considered.
- Steps S193 to S195 will be specifically described with reference to FIG.
- FIG. 20 shows an image group belonging to the event cluster of a certain event.
- the object behavior degree calculation unit 73 identifies, from the images G to N, the image sets HJ, JL, and HL, in which the same object appears, as the calculation targets.
- FIG. 20B shows the flow of calculating the object behavior for the image sets HJ and JL among these image sets.
- the object importance of person a, who appears in both images, is extracted; the average of the background importances of images H and J is calculated; the similarity of the background feature amounts between images H-J is calculated;
- and the shooting interval between images H-J is calculated.
- a histogram intersection (see Reference 3) can be used as a method of calculating the similarity of the background feature amount.
- the shooting interval between images is expressed from 0 to 1, where 0 indicates that the shooting times are farthest apart and 1 that they are closest;
- the shooting intervals between images are calculated based on the shooting dates and times of all the images belonging to the event, and normalized values are used so that all the calculated intervals fall in the range of 0 to 1.
- the object behavior of the image set HJ is calculated by multiplying the average of the extracted object importance, the calculated background importance, the similarity of the background feature amount, and the shooting date / time interval.
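The multiplication above, together with the histogram intersection of Reference 3 for the background similarity, can be sketched as follows. All numeric inputs are illustrative toy values.

```python
# Object behavior degree of S195: (object importance) x (average
# background importance) x (background similarity via histogram
# intersection, cf. Reference 3) x (normalized shooting-interval
# closeness).

def histogram_intersection(h1, h2):
    # Both histograms are assumed normalized to the same total mass.
    return sum(min(a, b) for a, b in zip(h1, h2)) / sum(h1)

def object_behavior(obj_importance, bg_importance_avg,
                    hist1, hist2, interval_closeness):
    similarity = histogram_intersection(hist1, hist2)
    return obj_importance * bg_importance_avg * similarity * interval_closeness

h_H = [10, 6, 0, 0, 16]   # toy 5-color background histograms (sum 32)
h_J = [10, 6, 0, 0, 16]
behavior = object_behavior(obj_importance=30, bg_importance_avg=0.5,
                           hist1=h_H, hist2=h_J, interval_closeness=0.9)
```

Identical background histograms give an intersection of 1, so a pair showing the same person against the same scene shortly apart in time scores highest, which is the intended behavior of this evaluation item.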
- an image set HJ and an image set JL are selected from the eight images (images G to I on the beach, images J to M on the sea, and image N on the road),
- and the object behavior degree of each combination is calculated.
- the object behavior of the image set JL in which the same person is captured in a similar background scene is calculated high, and the object behavior of the image set HJ in which the same person is captured in a different background scene is calculated low.
- the process of calculating the scene transition degree (S163) will be described with reference to FIG.
- the scene transition degree calculation unit 72 acquires background feature amounts from the background feature amount information storage unit 94, background importance levels from the background importance level information storage unit 95, and shooting date / time information from the event cluster information storage unit 96 (S211).
- for example, for the image set IM (the combination of images I and M) from the eight images G to N (FIG. 22A), the scene transition degree calculation unit 72 calculates the average of the background importances of both images, the dissimilarity of the background feature quantities between the images, and the interval between the shooting dates and times.
- the dissimilarity of the background feature quantity between images is calculated by subtracting the background similarity from 1. That is, the dissimilarity of the image set IM is 0.8 obtained by subtracting the similarity 0.2 of the image set IM from 1.
- the scene transition degree of the image set IM is calculated by multiplying the average of the calculated background importance, the dissimilarity of the background feature amount, and the shooting date and time interval (FIG. 22B).
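The multiplication above can be sketched as follows. The 0.2 similarity matches the image-set IM example; the background importances and interval are toy values.

```python
# Scene transition degree of S163: (average background importance) x
# (background dissimilarity, i.e. 1 - similarity) x (shooting interval).

def scene_transition(bg_imp1, bg_imp2, similarity, interval):
    dissimilarity = 1.0 - similarity
    return ((bg_imp1 + bg_imp2) / 2) * dissimilarity * interval

transition_IM = scene_transition(bg_imp1=0.6, bg_imp2=0.4,
                                 similarity=0.2, interval=0.9)
```

Note the contrast with the object behavior degree: here *dissimilar* backgrounds raise the score, because the item rewards pairs that capture a change of scene rather than continuity within one.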
- an image set IM and an image set MN are selected from eight images of images G to I on the beach, images J to M on the sea, and an image N on the road. The degree of scene transition is calculated.
- the scene transition degree is evaluated highly for image sets that have a high average background importance and a high possibility of representing a scene transition.
- a high average background importance suggests that many shots were taken of that scene, meaning the image set is likely to be considered important by the user.
- the scene transition degree is a variable that focuses on the transition of the scene rather than the person
- image sets in which a person is captured may be excluded from the calculation targets. That is, among the eight images in FIG. 22A, only image sets formed from the three images I, M, and N, in which no person is captured, may be set as calculation targets. Alternatively, instead of being excluded, image sets including an object may simply be given a low evaluation.
- as a result, a higher scene transition degree is calculated for the image set IM, which has a higher background importance and a larger shooting date/time interval than the image set MN.
- the scene overall image degree calculation unit 74 acquires background feature amounts from the background feature amount information storage unit 94, background importance levels from the background importance level information storage unit 95, shooting date/time information from the event cluster information storage unit 96, object cluster information from the object cluster information storage unit 92, and object importance information from the object importance information storage unit 93 (S231).
- if no object appears in either image of the set (S232: No), the scene overall image degree calculation unit 74 proceeds to step S28; if an object appears in either image (S232: Yes), it calculates the object occupation difference between the images (S233).
- the object occupation difference is the difference between the two images in the size (occupancy) of the largest object appearing in each image.
- the background similarity between the images is calculated (S234).
- the interval of the shooting date and time between the images is calculated (S235). Then, based on the calculated object occupation difference, background similarity, and shooting date/time interval, the scene overall image degree is calculated and stored in the image set information storage unit 97 (S236).
- Steps S233 to S236 will be specifically described with reference to FIG. 24.
- the scene overall image degree calculation unit 74 selects, for example, the image set HI (the combination of images H and I) from the eight images G to N, and calculates the object occupation difference between the images, the similarity of the background feature quantities, and the interval between shooting dates and times.
- when a person appears in both images, the object occupation difference is the difference in the size of that person between the two images. In the image set HI, however, a person appears only in image H, so the size (occupancy) of the person in image H is used directly as the object occupation difference. The scene overall image degree of the image set HI is then calculated by multiplying the calculated object occupation difference, the similarity of the background feature quantities, and the shooting date/time interval (FIG. 24B).
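- The per-pair computation of steps S233 to S236 might be sketched as follows; the function name, the convention that occupancies are fractions of the image area, and the plain numeric shooting interval are assumptions, not the patent's exact definitions.

```python
def scene_overall_degree(person_occ_a, person_occ_b,
                         bg_similarity, shooting_interval):
    """Scene overall image degree for one image pair: the object
    occupation difference, multiplied by the background-feature
    similarity and the shooting date/time interval.

    If a person appears in both images, the occupation difference is
    the difference of the person's size between them; if a person
    appears in only one image (as in image set HI), that image's
    occupancy is used directly."""
    if person_occ_a > 0 and person_occ_b > 0:
        occupation_diff = abs(person_occ_a - person_occ_b)
    else:
        occupation_diff = max(person_occ_a, person_occ_b)
    return occupation_diff * bg_similarity * shooting_interval
```

Multiplying by the background similarity (rather than the dissimilarity used for the scene transition degree) is what restricts high scores to image pairs from the same scene.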
- in the example of FIG. 24, the scene overall image degrees of the image set HI and the image set LM (the combination of images L and M) are calculated from the eight images: images G to I on the beach, images J to M on the sea, and image N on the road.
- as a result, the scene overall image degree of the image set HI, in which a person appears large in one image of the same scene, is calculated to be high, while that of the image set LM, in which a person appears only small in one image, is calculated to be low.
- the scene overall image degree may also be calculated using the importance of the objects appearing in the image set, so that image sets in which an important person appears are evaluated highly.
- FIG. 25 is a diagram showing a data structure of an object introduction degree, an object action degree, a scene transition degree, and a scene overall image degree indicating the result of image set evaluation calculation.
- the object introduction degree is information that evaluates, for a combination of two images belonging to an event cluster, an image set in which different important persons are captured, and is stored in the image set information storage unit 97 in a structure like the table 97a in FIG. 25.
- the object action degree is information for evaluating an image set in which the same important person is captured, from a combination of two images belonging to the event cluster, and is stored in a structure like the table 97b in FIG. 25.
- the scene transition degree is information for evaluating an image set in which different scenes are captured, from a combination of two images belonging to the event cluster, and is stored in a structure like the table 97c in FIG. 25.
- the scene overall image degree is information for evaluating an image set in which an important person appears in the same scene, from a combination of two images belonging to the event cluster, and is stored in a structure like the table 97d in FIG. 25.
- the object introduction degrees, object action degrees, scene transition degrees, and scene overall image degrees calculated for the combinations of the eight images G to N are stored in these table structures.
- in FIG. 25, only the values corresponding to FIGS. 18, 20, 22, and 24 are entered and the remaining values are omitted; "-" is entered for evaluation items whose values are not calculated.
- the arrangement pattern evaluation unit 80 acquires template information from the template information storage unit 91, image set evaluation information from the image set information storage unit 97, and event cluster information from the event cluster information storage unit 96 (S261).
- the arrangement pattern evaluation unit 80 selects, from the group of images belonging to the event cluster selected by the user, as many images as there are frames in the template information selected by the user, and generates an arrangement pattern for arranging them in the frames (S262).
- the evaluation value of the generated arrangement pattern is calculated based on the evaluation items defined for the image sets between pairs of frames included in the template information, and is stored in the arrangement pattern information storage unit 98 (S263).
- if evaluation values have not yet been calculated for all arrangement patterns (S264: No), the process returns to step S262; when they have been calculated for all patterns (S264: Yes), the arrangement pattern evaluation unit 80 creates an album using the highest-ranked arrangement pattern (S265).
- the display control unit 100 displays the created album on the display 4 (S266).
- FIG. 27 is a diagram showing a data structure of template information.
- the template information includes “design (layout)”, “evaluation item”, and “evaluation item between frames”.
- FIG. 27 shows only the details of the template information of template A as an example, but templates B and C have the same data structure.
- the “design” of template A includes information on the frame layout (the coordinate position, size, and angle of each frame), information on the individual frames (frame background color, pattern, frame decoration, and the like), and information on the album mount design (mount size, mount background color, and the like).
- FIG. 27 shows a simplified image; in practice, the various parameters are assumed to be stored in a predetermined database format.
- the “evaluation item” of template A includes the type of condition relating to the feature amount of the image.
- as conditions applied to combinations of frames, there are four types of evaluation items: the object introduction degree (OI), the object action degree (OA), the scene transition degree (ST), and the scene overall image degree (SO).
- in order to derive an arrangement pattern that makes it easy to look back on an event, the “evaluation items between frames” of template A include a correspondence table of the evaluation items applied to the pairs of images arranged in pairs of frames.
- in this correspondence table, for example, the object introduction degree (OI) is associated with the frame pair ab.
- Steps S262 to S263 will be specifically described with reference to FIG.
- Template A has six frames from frame a to frame f.
- the arrangement pattern evaluation unit 80 calculates the arrangement pattern evaluation value for each arrangement pattern based on the condition between frames defined by the “evaluation item between frames” of the template A (FIG. 29).
- the arrangement pattern evaluation value is obtained by adding up all the evaluation items defined in the “evaluation items between frames” of template A (see FIG. 27):
- OI between frames ab
- SO between frames ac
- SO between frames bc
- ST between frames cd
- SO between frames de
- SO between frames df
- OA between frames ef
- in FIG. 29, the arrangement pattern evaluation values of the arrangement pattern P1 and the arrangement pattern P20160 are calculated; the evaluation value 200 of the arrangement pattern P20160 is higher than the evaluation value 100 of the arrangement pattern P1.
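- The summation over template A's frame pairs can be illustrated as follows; the per-pair scores are hypothetical values chosen so that the total matches the example evaluation value of 200.

```python
def arrangement_pattern_value(pair_scores):
    """Sum the inter-frame evaluation items of template A for one
    arrangement pattern (OI for ab, SO for ac/bc/de/df, ST for cd,
    OA for ef), as listed in the template's correspondence table."""
    template_a_pairs = [("a", "b"), ("a", "c"), ("b", "c"),
                        ("c", "d"), ("d", "e"), ("d", "f"), ("e", "f")]
    return sum(pair_scores[pair] for pair in template_a_pairs)

# Hypothetical per-pair scores for one arrangement pattern.
example_scores = {("a", "b"): 50, ("a", "c"): 30, ("b", "c"): 20,
                  ("c", "d"): 40, ("d", "e"): 20, ("d", "f"): 20,
                  ("e", "f"): 20}
```

Each score would come from the image set information storage unit 97, looked up for the pair of images actually placed in the corresponding pair of frames.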
- FIG. 30 is a diagram showing a data structure of an arrangement pattern evaluation value indicating a result of arrangement pattern evaluation calculation.
- the arrangement pattern evaluation value is information that evaluates, based on the selected event image group and template, how well a combination of images allows the event to be looked back on efficiently.
- it includes the items "arrangement pattern ID", "arrangement pattern" indicating which image is placed in each frame of the template, "arrangement pattern evaluation value", and "rank".
- the arrangement pattern evaluation values 100 and 200 of the arrangement patterns P1 and P20160 in FIG. 29 with respect to template A, together with the ranks based on those evaluation values, are stored.
- since the arrangement pattern P20160 with the highest rank (highest evaluation value) best satisfies the evaluation items of template A, an album created according to this arrangement pattern P20160 can be expected to match the user's wishes.
- although the number of images here is as small as eight, manually creating an album from a large number of images is very troublesome for the user, so applying this embodiment is effective.
- FIG. 31 shows another example of the template A stored in the template information storage unit 91. This is basically the same as FIG. 27 except that the template A includes a column “evaluation item, comment”.
- this “evaluation item, comment” column indicates the correspondence between evaluation items and comments. For example, the comment “I was such a member” is associated with the object introduction degree (OI).
- using such a correspondence, comments may be displayed between the frames of an automatically created album.
- FIG. 32A shows an album 320a created using the arrangement pattern P20160, which has the highest (first-ranked) arrangement pattern evaluation value, and FIG. 32B shows an album 320b created using the arrangement pattern P1, which has the lowest (20160th-ranked) arrangement pattern evaluation value.
- in the album 320a, comments corresponding to the evaluation items between frames defined in template A of FIG. 31, such as the comment “I was such a member” between the frames ab, are displayed.
- in the album 320a, images G and H, each showing a person, are arranged in the frames ab, which matches the comment “I was such a member”.
- in the album 320b, by contrast, images I and L, in which no person appears, are arranged in the frames ab, which does not match the comment “I was such a member”.
- FIG. 33 shows a template selection screen 330 that is displayed on the display 4 by the template selection unit 62.
- the template selection unit 62 reads out the three types of templates A to C stored in the template information storage unit 91 and displays them on the screen 330. The selection of a template is then received from the user via an input device (not illustrated).
- (3) Template examples: template A described in the embodiment uses four types of evaluation items (see FIG. 31), but all four types are not necessarily required; a template may be formed by combining any of these evaluation items.
- FIG. 34A shows a data structure of template information of template B.
- template B is a simple template containing only the two frames a and b, and the object introduction degree is associated as the evaluation item between the frames ab.
- FIG. 34B shows an example of an album created using the template B.
- in the embodiment, an album of the type in which images are pasted into frames on a mount has been described as an example, but the present invention is not limited to this; a slide-show type album may also be used.
- FIG. 35A shows the data structure of the template information of the template B ′.
- the “design (layout)” of the template B ′ includes information indicating slides corresponding to the slide numbers 1 to 5 in the slide show. For example, slide number 3 is associated with slide frame a, and slide number 4 is associated with slide frame b.
- the “evaluation item between frames” of the template B ′ indicates that the pair of slide frames ab is evaluated using the object introduction degree.
- FIG. 35B shows an example of an album created using the template B′.
- (5) Flow of Arrangement Pattern Evaluation Value Calculation Process In the embodiment, as shown in S262 to S264 of FIG. 26, the respective evaluation values are calculated for all generated arrangement patterns (brute force method).
- the present invention is not limited to this.
- for example, if the evaluation value calculated in step S263 is equal to or greater than a threshold value, the evaluation value calculation may be terminated at that point and the process may proceed to step S265.
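- This early-termination variant can be sketched as follows, with hypothetical names; the scoring function and threshold are stand-ins for the per-pattern evaluation of step S263.

```python
def best_pattern_with_cutoff(patterns, evaluate, threshold):
    """Evaluate arrangement patterns in order, but stop as soon as one
    scores at or above the threshold, instead of scoring every pattern
    as in the brute-force method."""
    best_pattern, best_value = None, float("-inf")
    for pattern in patterns:
        value = evaluate(pattern)
        if value > best_value:
            best_pattern, best_value = pattern, value
        if value >= threshold:  # good enough: terminate early
            break
    return best_pattern, best_value
```

The trade-off is that the returned pattern is only guaranteed to meet the threshold, not to be the global maximum over all arrangement patterns.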
- in the embodiment, each frame set in the “evaluation items between frames” of the template (see FIG. 27) is composed of two frames, but the frame sets are not limited to this; a frame set may be composed of more than two frames, for example, three frames.
- suppose a certain evaluation item is defined for a frame set consisting of the three frames 1, 2, and 3. The evaluation value may then be calculated by combining the feature amounts of the three images A, B, and C arranged in that frame set.
- the evaluation values of three-image sets may be calculated in advance and used to evaluate the arrangement pattern. However, in order to reduce the amount of computation (so that the evaluation values of two-image sets can be reused), the evaluation value of the three-image set of images A, B, and C may be calculated as the average of the evaluation values of the two-image sets AB, BC, and AC.
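- The pairwise-average approximation for a three-image set can be sketched as follows; `pair_value` stands in for any already-computed two-image set evaluation.

```python
def triple_set_value(pair_value, img_a, img_b, img_c):
    """Approximate the evaluation value of the three-image set ABC by
    the average of the already-computed two-image set values AB, BC,
    and AC, avoiding a separate three-image computation."""
    return (pair_value(img_a, img_b)
            + pair_value(img_b, img_c)
            + pair_value(img_a, img_c)) / 3.0
```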
- a template in which two frame sets and three frame sets are mixed may be used.
- an evaluation item of a scene transition degree (ST) is associated with a frame set composed of frames 123.
- this scene transition degree can be obtained, for example, by averaging the three values ST12, ST23, and ST13, that is, the scene transition degrees of the frame pairs 12, 23, and 13.
- such a frame set is useful for extracting a combination of images in which the location changes across three places from the images of an event held in three places.
- an evaluation item of the object introduction degree is associated with a frame set including the frame 1234.
- such a frame set is useful for extracting, from images taken by a user in a four-member family, a combination of images in which every member of the family appears.
- Frames not participating in a template frame set: in the embodiment, evaluation items are associated with the seven frame pairs ab, ac, bc, cd, de, df, and ef in the “evaluation items between frames” of the template (see FIG. 27).
- thus, all of the frames a to f belong to at least one frame set.
- in other words, the frames constituting the seven frame sets cover all N = 6 frames.
- an evaluation value for each arrangement pattern may be calculated for template A and template D.
- this can be realized by automatically selecting the template A and the template D (or accepting their selection from the user) in step S24 of FIG. 2, and repeating the processes of steps S261 to S264 of FIG. 26 for each selected template.
- template D in the example of FIG. 36 has six frames like template A, but its design and evaluation items between frames differ from those of template A.
- Arrangement pattern used for album creation In the embodiment, the album is created using the arrangement pattern of the highest order (FIG. 26: S265), but the present invention is not limited to this. For example, three corresponding albums may be created using the top three placement patterns, and a list of created albums may be displayed to the user.
- (12) Single-frame example: in the embodiment, evaluation values are calculated for combinations of images inserted into a plurality of frames (frame sets), but the present invention is not limited to this.
- one evaluation item may be associated with one frame (a single frame) for some of the frames of a template.
- the template of FIG. 37A includes eight frames (frames 1 to 8); an evaluation item of person importance is associated with frame 1, and a separate single-frame evaluation item is associated with frame 2.
- for frames 1 and 2, frames and evaluation items thus correspond one to one.
- frames 3 and 4 are associated with the object introduction degree (OI), frames 5 and 6 with the scene transition degree (ST), and frames 7 and 8 with the scene overall image degree (SO).
- FIG. 37B shows an example of the arrangement pattern of the template D and its evaluation value.
- (13) Relationship between frames and evaluation items: in the embodiment, it has been described that mainly one evaluation item is associated with one frame set. However, a plurality of evaluation items may be associated with one frame set, and a plurality of evaluation items may likewise be associated with the single frames described in (12).
- (14) Reduction of the number of arrangement patterns: in the embodiment, all arrangement patterns obtained by arranging images in the template frames are generated (S264 in FIG. 26 and the like); however, the number of arrangement patterns for which evaluation values are calculated may be reduced.
- the first reduction technique is to narrow down the number of images used for generating the arrangement pattern based on the background importance of each image. This will be described with reference to FIG.
- the arrangement pattern evaluation unit 80 sorts the background importance of each of the 30 images in descending order and narrows them down to the top 15 (S391). Then, the arrangement pattern evaluation unit 80 uses the narrowed-down image for generating the arrangement pattern (S392).
- as a result, the number of arrangement patterns to be generated is reduced to 32,760 (15 × 14 × 13 × 12).
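- The first reduction technique and the resulting pattern count can be sketched as follows; the function names are illustrative, and the frame count of 4 follows the 15 × 14 × 13 × 12 figure given in the text.

```python
from itertools import permutations

def narrow_by_background_importance(images, importance, keep):
    """First reduction technique: keep only the `keep` images with the
    highest background importance before generating patterns."""
    ranked = sorted(images, key=lambda im: importance[im], reverse=True)
    return ranked[:keep]

def pattern_count(n_images, n_frames):
    """Number of arrangement patterns: ordered selections of
    n_frames images out of n_images."""
    return sum(1 for _ in permutations(range(n_images), n_frames))
```

Narrowing 30 images to 15 thus shrinks the search space before any evaluation value is computed.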
- the second reduction method is to divide an image group into scenes, narrow down the scenes based on the average scene importance of the image groups of each scene, and generate an arrangement pattern using the narrowed down scenes. This will be described with reference to FIG.
- the arrangement pattern evaluation unit 80 divides the images into scenes 1 to 8, obtains for each scene the average background importance of the images in the scene as the scene importance, and narrows down the scenes based on the obtained scene importance.
- the arrangement pattern evaluation unit 80 uses 21 images in the scenes 2, 4, 6, 7, and 8 having relatively high scene importance for generating the arrangement pattern (S402).
- as a result, the number of arrangement patterns to be generated is reduced to 143,640 (21 × 20 × 19 × 18).
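- The second, scene-based reduction can be sketched in the same spirit; the data layout mapping scene ids to (image, importance) pairs is an assumption made for illustration.

```python
def narrow_by_scene_importance(scenes, keep):
    """Second reduction technique: score each scene by the average
    background importance of its images, keep the `keep` highest
    scenes, and return only their images for pattern generation.

    `scenes` maps a scene id to a list of (image id, background
    importance) pairs."""
    def scene_importance(items):
        return sum(imp for _, imp in items) / len(items)
    ranked = sorted(scenes, key=lambda s: scene_importance(scenes[s]),
                    reverse=True)
    return [img for s in ranked[:keep] for img, _ in scenes[s]]
```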
- the scene importance calculation method is not limited to the above method using the background importance of the images; other general methods may be used, such as calculation methods that take the importance of the objects in the images into account, or that reflect the user's preferences.
- (15) Examples of formulas: hereinafter, examples of formulas for calculating the values described in the embodiment are given.
- the background similarity can be obtained by equation (1) in FIG.
- the area i is obtained when the image is divided (segmented). Abs is a function for obtaining an absolute value.
- the photographing time interval can be obtained by the equation (2) in FIG.
- the degree of object introduction can be obtained from equation (5) in FIG.
- the object behavior level can be obtained by Expression (6) in FIG.
- the scene transition degree can be obtained by Expression (7) in FIG.
- the arrangement pattern evaluation value can be obtained by Expression (9).
- (16) In the embodiment, an album has been described as an example of a work created from images.
- the work is not limited to a book-type album; an optimal combination of images may be selected to create a work in a slide-show format in which photos transition, or in a movie format in which photos move with various animations.
- an SD memory card has been described as an example of a storage medium.
- the present invention is not limited to this as long as it is a recording medium.
- SmartMedia, CompactFlash (registered trademark), Memory Stick (registered trademark), SD memory card, multimedia card, CD-R/RW, DVD±R/RW, DVD-RAM, HD-DVD, BD (Blu-ray Disc), or another recording medium may be used.
- the image evaluation apparatus of the embodiment may be typically realized as an LSI (Large Scale Integration) that is an integrated circuit. Each circuit may be individually configured as one chip, or may be integrated into one chip so as to include all or some of the circuits. Although described as LSI here, it may be called IC (Integrated Circuit), system LSI, super LSI, or ultra LSI depending on the degree of integration.
- circuit integration is not limited to LSIs; implementation using dedicated circuitry or general-purpose processors is also possible.
- an FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may be used.
- a control program composed of program code for causing processors of various devices such as computers, and various circuits connected to those processors, to execute the image evaluation processing described in the embodiment may be recorded on recording media, or circulated and distributed via various communication channels.
- such recording media include SmartMedia, CompactFlash (registered trademark), Memory Stick (registered trademark), SD memory card (registered trademark), multimedia card, CD-R/RW, DVD±R/RW, DVD-RAM, HD-DVD, BD (Blu-ray Disc), and the like.
- the circulated and distributed control program is used by being stored in a memory or the like readable by a processor, and the various functions described in the embodiment are realized by the processor executing the control program.
- <Supplement 2> The present embodiment includes the following aspects.
- the image evaluation apparatus according to the present embodiment comprises: template storage means for storing a template having N frames (N is a natural number of 2 or more) in which images are arranged, one or more frame sets each configured by combining a plurality of frames from among the N frames, and evaluation items relating to the feature amounts of the images arranged in each frame set; acquisition means for acquiring a plurality of images; evaluation means for generating a plurality of arrangement patterns in which N images selected from the acquired plurality of images are arranged in the N frames, and calculating an evaluation value for each generated arrangement pattern based on the evaluation items; and evaluation value storage means for storing the evaluation value for each arrangement pattern calculated by the evaluation means, wherein the evaluation means identifies the N images arranged in the arrangement pattern corresponding to the highest evaluation value among the evaluation values stored in the evaluation value storage means.
- the evaluation items relating to the feature amounts of the images arranged in a frame set may include the introduction property of the objects included in the images arranged in the frame set, or the transition of the scenes of the images arranged in the frame set.
- the apparatus may comprise object clustering means for clustering the objects included in the plurality of images, and the evaluation means may evaluate the introduction property of the objects more highly when the object clusters of the objects included in the images arranged in the frame set differ from each other.
- This configuration contributes to the creation of an album suitable for introducing objects such as people.
- the apparatus may comprise object importance calculation means for calculating the importance of each clustered object cluster, and the evaluation means may evaluate the introduction property of the objects more highly as the importance of the object clusters of the objects included in the images arranged in the frame set is higher.
- the apparatus may comprise extraction means for extracting a background feature amount from each of the plurality of images, and the evaluation means may evaluate the transition of the scenes based on the dissimilarity of the background feature amounts of the images arranged in the frame set, or on the magnitude of the difference between the shooting dates and times of those images.
- This configuration can contribute to the creation of an album suitable for introducing scene changes.
- the apparatus may comprise background importance calculation means for calculating a background importance for each of the plurality of images, and the evaluation means may evaluate the transition of the scenes more highly as the background importance of each image arranged in the frame set is higher.
- the apparatus may comprise object clustering means for clustering the objects included in the plurality of images, and the evaluation items relating to the feature amounts of the images arranged in a frame set may include a degree of change relating to the same object cluster between images having similar backgrounds.
- the apparatus may comprise extraction means for extracting a background feature amount from each of the plurality of images, and the evaluation means may evaluate the degree of change more highly when the images arranged in the frame set include the same object cluster and the background feature amounts of those images are more similar.
- This configuration can contribute to the creation of an album suitable for introducing the behavior of objects such as people.
- the apparatus may comprise object importance calculation means for calculating the importance of each clustered object cluster, and background importance calculation means for calculating the importance of the background for each of the plurality of images, and the evaluation means may evaluate the degree of change more highly as the importance of the object clusters of the objects included in the images arranged in the frame set is higher and as the importance of the backgrounds of those images is higher.
- (10) the apparatus may comprise calculation means for calculating the area occupied by an object in each of the plurality of images, and the evaluation means may evaluate the degree of change more highly as the occupied areas of objects belonging to the same object cluster differ more between the images arranged in the frame set.
- This configuration can contribute to the creation of an album suitable for introducing the entire scene.
- the apparatus may comprise object importance calculation means for calculating the importance of each clustered object cluster, and background importance calculation means for calculating the background importance for each of the plurality of images, and the evaluation means may evaluate the degree of change more highly as the importance of the object clusters of the objects included in the images arranged in the frame set is higher and as the importance of the backgrounds of those images is higher.
- the apparatus may comprise background importance calculation means for calculating the importance of the background for each of the plurality of images, and the evaluation means may calculate the evaluation value based on the importance of the background of each image arranged in the frame set.
- the apparatus may comprise extraction means for extracting a background feature amount from each of the plurality of images, and the background importance calculation means may calculate the importance of the background of a calculation-target image based on its similarity with the background feature amounts of other images whose shooting dates and times are close, and on the similarity between the background feature amount of that image and the background feature amounts of images in which an important person appears.
- the apparatus may comprise display control means for displaying information corresponding to the evaluation item of a frame set on which the calculation of the evaluation value was based, in association with the images arranged in that frame set.
- This configuration can inform the user of the relationship between the images arranged in the frame set.
- the frames constituting the one or more frame sets may cover all of the N frames.
- with this configuration, a meaningful relationship can be provided among the images arranged in all N frames.
- the evaluation means may narrow down the acquired plurality of images and select images from the narrowed-down set as the selection targets.
- This configuration can contribute to a reduction in the processing load for calculating the evaluation value.
- the apparatus may comprise background importance calculation means for calculating a background importance for each of the plurality of images, and the evaluation means may perform the narrowing down based on the background importance of the plurality of images.
- the evaluation unit may divide the plurality of images into a plurality of scenes and perform the narrowing down based on the importance of each divided scene.
- An image evaluation method includes an acquisition step of acquiring a plurality of images, a reference step of referring to a template stored in a template storage unit, and the template for arranging images N (N is a natural number of 2 or more) frames, one or more frame sets configured by combining a plurality of frames from the N frames, and the feature amount of each image arranged in each frame set And an arrangement pattern when N images are selected from a plurality of images acquired based on the referenced template and arranged in the selected N frames.
- a program according to the present invention is a program for causing a computer to execute image evaluation processing, the program causing the computer to execute: an acquisition step of acquiring a plurality of images; a reference step of referring to a template stored in a template storage unit, the template having N frames (N is a natural number of 2 or more) in which images are arranged, one or more frame sets each configured by combining a plurality of frames from among the N frames, and evaluation items relating to the feature amounts of the images arranged in each frame set; an evaluation step of generating, based on the referenced template, a plurality of arrangement patterns in which N images selected from the acquired plurality of images are arranged in the N frames, and calculating an evaluation value for each generated arrangement pattern based on the evaluation items; an evaluation value storage step of storing the evaluation value for each arrangement pattern calculated in the evaluation step; and a specifying step of specifying the N images arranged in the arrangement pattern corresponding to the highest evaluation value among the evaluation values stored in the evaluation value storage step.
- an integrated circuit according to the present invention comprises: template storage means for storing a template having N frames (N is a natural number of 2 or more) in which images are arranged, one or more frame sets each configured by combining a plurality of frames from among the N frames, and evaluation items relating to the feature amounts of the images arranged in each frame set; acquisition means for acquiring a plurality of images; evaluation means for generating a plurality of arrangement patterns in which N images selected from the acquired plurality of images are arranged in the N frames, and calculating an evaluation value for each generated arrangement pattern based on the evaluation items; and evaluation value storage means for storing the evaluation value for each arrangement pattern calculated by the evaluation means, wherein the evaluation means identifies the N images arranged in the arrangement pattern corresponding to the highest evaluation value among the evaluation values stored in the evaluation value storage means.
- Each of the one or more frame sets may be composed of two frames.
- The evaluation item may include a plurality of types of evaluation items, and at least one of the plurality of types may be defined as the evaluation item for each of the one or more frame sets.
- According to the image evaluation method, an album with which an event can be reviewed efficiently can be created from the enormous amount of image content held by a plurality of users, such as family members. Compared with conventional evaluation methods, the user can easily create and browse a high-quality album from image groups of diverse events whose composition is not fixed in advance.
- To this end, the image evaluation apparatus preferentially selects image combinations from four viewpoints, based on persons and scenes, so that important scenes and family members are captured, and displays the selected combinations as an album. The user can therefore look back on an event efficiently through the album. The apparatus is useful for stationary terminals such as personal computers and server terminals, and also as a mobile terminal such as a digital camera or a mobile phone.
Abstract
Description
(Embodiment 1)
<Configuration>
Embodiment 1 will now be described with reference to the drawings.
Suppose that:
- the number of images acquired by the image acquisition unit 10 is 100,
- the number of images in which person a appears is 30,
- the number of images in which person b appears is 20, and
- the number of images in which person c appears is 10.
In this case, the object importance calculation unit 51 calculates, according to the number of images in which each person appears, the importance of person a as 30, the importance of person b as 20, and the importance of person c as 10 (see FIG. 12).
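The proportional-to-appearance-count rule above is simple enough to sketch in a few lines of Python; the function name and data layout are my own illustration, not taken from the specification:

```python
from collections import Counter

def object_importance(images):
    """Importance of each clustered person = the number of images the
    person appears in (the rule applied by unit 51 in the example)."""
    counts = Counter()
    for people in images:   # each element: set of person labels in one image
        counts.update(people)
    return dict(counts)

# 100 acquired images: person a appears in 30, b in 20, c in 10.
images = [{"a"}] * 30 + [{"b"}] * 20 + [{"c"}] * 10 + [set()] * 40
print(object_importance(images))  # {'a': 30, 'b': 20, 'c': 10}
```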
Using the five elements (a) event cluster information, (b) object cluster information, (c) background feature amounts, (d) object importance, and (e) background importance, a pair of images (an image set) arranged in a pair of frames is evaluated.
(Evaluation item 1) "Object introduction degree": indicates how well the objects appearing in the images serve to introduce the people involved.
(Judgment criteria for evaluation item 1)
- The more kinds of people appear in the pair of images, the higher the evaluation.
- The more important the people appearing in the pair of images, the higher the evaluation.
(Evaluation item 2) "Object action degree": indicates the degree of an object's activity within a scene.
(Judgment criteria for evaluation item 2)
A precondition for this evaluation is that the same person appears in both images.
- The more important the shared person, the higher the evaluation.
- The higher the background importance of both images, the higher the evaluation.
- The more similar the backgrounds of the pair of images, the higher the evaluation (similar backgrounds suggest the same scene).
- The closer the shooting dates and times of the pair of images, the higher the evaluation (close timestamps suggest that both images belong to one continuous sequence shot in the same scene).
(Evaluation item 3) "Scene transition degree": indicates the degree of transition between scenes.
(Judgment criteria for evaluation item 3)
- The higher the background importance of both images, the higher the evaluation.
- The more dissimilar the backgrounds of the pair of images, the higher the evaluation (dissimilar backgrounds suggest different scenes).
- The farther apart the shooting dates and times of the pair of images, the higher the evaluation (a large time gap suggests that the photographer moved in the meantime and the scenes differ).
(Evaluation item 4) "Scene overview degree": an evaluation item for looking back at a particular scene; when a person appears in both a close-up and a wide shot, it indicates the degree of contrast between the close-up and the wide shot.
(Judgment criteria for evaluation item 4)
A precondition for this evaluation is that a person appears in at least one image of the pair.
- The higher the background importance of both images, the higher the evaluation.
- The more the sizes of the person differ between the pair of images, the higher the evaluation (indicating a close-up/wide-shot contrast).
- The more similar the backgrounds of the pair of images, the higher the evaluation.
- The closer the shooting dates and times of the pair of images, the higher the evaluation.
The above outlines the four evaluation items 1 to 4. The judgment criteria listed for each item are merely examples, and the items are not limited to these.
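Since the judgment criteria above are stated only qualitatively, the Python sketch below shows one plausible way to turn them into scores. The additive form, the field names, and the assumption that `bg_sim` (background similarity) and `time_gap` (shooting-interval) are pre-normalized to [0, 1] are all choices of this illustration, not part of the specification:

```python
def object_introduction(a, b, importance):
    # Item 1: more kinds of people, and more important people, score higher.
    return sum(importance.get(p, 0.0) for p in a["people"] | b["people"])

def object_action(a, b, importance, bg_sim, time_gap):
    # Item 2: precondition is a shared person; important shared people,
    # important/similar backgrounds, and close timestamps score higher.
    shared = a["people"] & b["people"]
    if not shared:
        return 0.0
    return (max(importance.get(p, 0.0) for p in shared)
            + a["bg_imp"] + b["bg_imp"] + bg_sim + (1.0 - time_gap))

def scene_transition(a, b, bg_sim, time_gap):
    # Item 3: important but dissimilar backgrounds shot far apart score higher.
    return a["bg_imp"] + b["bg_imp"] + (1.0 - bg_sim) + time_gap

def scene_overview(a, b, bg_sim, time_gap):
    # Item 4: precondition is a person in at least one image; a close-up vs.
    # wide-shot size contrast on a similar, important background scores higher.
    if not (a["people"] or b["people"]):
        return 0.0
    return (a["bg_imp"] + b["bg_imp"]
            + abs(a["person_size"] - b["person_size"])
            + bg_sim + (1.0 - time_gap))
```

Each function takes two per-image feature dictionaries (people, background importance, person size) and returns a larger value exactly when the corresponding judgment criteria above are better satisfied.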
<Operation>
The flow up to evaluating an image arrangement pattern is described below.
The object clustering unit 22 then clusters the four faces F1 to F4, grouping the mutually similar faces F2 and F3 as person a, and faces F1 and F4 as person b (see FIG. 4(c)).
Background frequency of image C = (similarity of background feature amounts between images C and D) + (similarity of background feature amounts between images C and E) + (similarity of background feature amounts between images C and F).
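The formula above is a sum of pairwise background-feature similarities. Assuming cosine similarity over feature vectors (the passage does not fix the similarity measure, so this is an illustrative choice), it can be written as:

```python
def cosine(u, v):
    # Cosine similarity between two background feature vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv) if nu and nv else 0.0

def background_frequency(target, others, similarity=cosine):
    # Background frequency of one image = sum of its background-feature
    # similarities to every other image in the group.
    return sum(similarity(target, other) for other in others)

# Image C against images D, E, F (toy 2-D feature vectors for brevity)
c = [1.0, 0.0]
freq = background_frequency(c, [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
# 1 + 0 + 1/sqrt(2), roughly 1.707
```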
This reflects the user's tendency to look back on what the family members who participated in an event did: an image set in which the same important person appears in the same scene is considered a more important image set.
The total is obtained by adding up all of the following evaluation items:
- OI between frames a and b
- SO between frames a and c
- SO between frames b and c
- ST between frames c and d
- SO between frames d and e
- SO between frames d and f
- OA between frames e and f
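In other words, the score of one arrangement pattern is the sum of the evaluation values of its frame sets. A minimal sketch, with illustrative names of my own:

```python
def pattern_score(assignment, frame_sets):
    """Total evaluation value of one arrangement pattern.

    `assignment` maps each frame name to the value placed in it;
    `frame_sets` maps a frame pair to the scoring function of the
    evaluation item (OI, SO, ST, OA, ...) attached to that pair.
    """
    return sum(score(assignment[f1], assignment[f2])
               for (f1, f2), score in frame_sets.items())

add = lambda x, y: x + y  # stand-in for a real evaluation item
frame_sets = {(p[0], p[1]): add
              for p in ["ab", "ac", "bc", "cd", "de", "df", "ef"]}
assignment = {f: i for i, f in enumerate("abcdef")}  # a=0 ... f=5
print(pattern_score(assignment, frame_sets))  # 35
```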
<Supplement 1>
Although the present embodiment has been described above, the present invention is not limited to the above and can be implemented in various forms for achieving the object of the present invention and objects related or incidental thereto. For example, the following variations are possible.
(1) Album display example
FIG. 31 shows another example of template A stored in the template information storage unit 91. It is basically the same as FIG. 27, except that template A includes an "evaluation item, comment" column.
(2) Template selection example
An example of the user interface for template selection described in step S24 of FIG. 2 is given here.
(3) Template examples
Template A described in the embodiment uses four types of evaluation items (see FIG. 31), but not all four are essential; a template may combine any subset of the evaluation items.
FIG. 34(a) shows the data structure of the template information of template B. Template B is a simple template containing only the two frames a and b, with the object introduction degree associated as the evaluation item between frames a and b.
(4) Album examples
The embodiment was described taking as an example an album in which images are pasted into frames on a mount, but the album is not limited to this type and may be a slideshow-type album.
For example, slide number 3 is associated with slide frame a, and slide number 4 is associated with slide frame b.
(5) Flow of arrangement pattern evaluation value calculation
In the embodiment, as shown in S262 to S264 of FIG. 26, an evaluation value is calculated for every generated arrangement pattern (a brute-force method).
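The brute-force method can be sketched directly with itertools; the helper name and the single-pair scoring interface are my own simplification of the per-frame-set evaluation described above:

```python
from itertools import permutations

def best_pattern(images, frames, frame_pairs, pair_score):
    """Score every ordered placement of len(frames) images into the
    frames and return (best score, best assignment): brute force."""
    best_score, best_assignment = None, None
    for chosen in permutations(images, len(frames)):
        assignment = dict(zip(frames, chosen))
        total = sum(pair_score(assignment[f1], assignment[f2])
                    for f1, f2 in frame_pairs)
        if best_score is None or total > best_score:
            best_score, best_assignment = total, assignment
    return best_score, best_assignment

score, best = best_pattern([1, 2, 3], ["a", "b"], [("a", "b")],
                           lambda x, y: 10 * x + y)
print(score, best)  # 32 {'a': 3, 'b': 2}
```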
(6) Number of frames constituting a template frame set
In the embodiment, each entry of the template's "evaluation items between frames" (see FIG. 27) consists of a set of two frames, but the number is not limited to two. A frame set may consist of more than two frames; for example, three frames may constitute one frame set.
(7) Frames not involved in any template frame set
In the embodiment, evaluation items are associated with the seven frame sets ab, ac, bc, cd, de, df, and ef in the template's "evaluation items between frames" (see FIG. 27).
(8) Template "evaluation items between frames"
A weight column may be added to the "evaluation items between frames" of FIG. 27.
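With such a weight column, each frame set's evaluation value would presumably be scaled before summation. A sketch under that assumption (the triple layout is my own, not the patent's data structure):

```python
def weighted_pattern_score(assignment, weighted_items):
    """`weighted_items`: iterable of ((frame1, frame2), score_fn, weight).
    Each frame set's evaluation value is multiplied by its weight, then
    all weighted values are summed into the pattern's total."""
    return sum(weight * score_fn(assignment[f1], assignment[f2])
               for (f1, f2), score_fn, weight in weighted_items)

items = [(("a", "b"), lambda x, y: x + y, 2.0),
         (("b", "c"), lambda x, y: x * y, 0.5)]
total = weighted_pattern_score({"a": 1, "b": 2, "c": 3}, items)
# 2.0 * (1 + 2) + 0.5 * (2 * 3) = 9.0
```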
(9) Evaluation using a plurality of templates
In the embodiment, the arrangement pattern evaluation (FIG. 26) uses a single template, but evaluation is not limited to this; a plurality of templates may be used as evaluation targets.
(10) How arrangement patterns are counted
In the embodiment, the six frames a to f of template A are distinguished from one another, so the number of patterns for arranging 8 images into 6 frames is 8P6 = 20160. This permutation-based counting is best when the individuality of each frame (its position in the template, its decoration, its size, and so on) matters, but counting is not limited to this approach.
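The 20160 figure can be checked with the standard permutation formula, and contrasted with the combination count that would apply if frame identity were ignored:

```python
from math import comb, perm  # math.perm/comb: Python 3.8+

# Six distinguished frames: an ordered selection of 6 images out of 8.
assert perm(8, 6) == 20160   # 8P6 = 8!/(8-6)!

# If the frames were interchangeable, only the choice of images matters.
assert comb(8, 6) == 28      # 8C6
```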
(11) Arrangement patterns used for album creation
In the embodiment, the album is created using the top-ranked arrangement pattern (FIG. 26: S265), but this is not restrictive. For example, the top three arrangement patterns may be used to create three corresponding albums, and a list of the created albums may be displayed to the user.
(12) Single-frame example
In the embodiment, evaluation values are calculated from combinations of images inserted into a plurality of frames (frame sets), but not all frames of a template need be part of frame sets.
(13) Relationship between frames and evaluation items
The embodiment mainly describes associating one evaluation item with one frame set, but a plurality of evaluation items may be associated with one frame set. A plurality of evaluation items may also be associated with the single frame described in (12).
(14) Reducing the number of arrangement patterns
In the embodiment, all arrangement patterns of images placed in the template frames are generated (S264 of FIG. 26 and elsewhere), but the number of arrangement patterns for which evaluation values are calculated may be reduced in order to lighten the processing load of evaluation value calculation.
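The passage leaves the reduction method open. One common heuristic (my assumption for illustration, not the patent's stated method) is to shortlist the most important images before enumerating placements, which shrinks the permutation count sharply:

```python
from itertools import permutations

def shortlist_patterns(images, importance, k, n_frames):
    """Keep only the k most important images, then enumerate placements.
    A heuristic pruning of the search space, not an exhaustive search."""
    shortlist = sorted(images, key=lambda im: importance[im], reverse=True)[:k]
    return list(permutations(shortlist, n_frames))

imgs = list("abcdefgh")
imp = {im: i for i, im in enumerate(imgs)}  # 'h' is most important
patterns = shortlist_patterns(imgs, imp, k=6, n_frames=6)
print(len(patterns))  # 720 patterns instead of 8P6 = 20160
```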
(15) Formula examples
Examples of formulas for evaluating the values described in the embodiment are given below.
(16) The embodiment describes an album as an example of a work made from images, but the work is not limited to a book-format album; the optimal image combination may be selected and turned into a work in a slideshow format in which photographs transition, or in a movie format in which photographs move with rich animation.
(17) The present embodiment takes an SD memory card as an example of a storage medium, but any recording medium may be used, such as SmartMedia, CompactFlash (registered trademark), Memory Stick (registered trademark), an SD memory card, a multimedia card, CD-R/RW, DVD±R/RW, DVD-RAM, HD-DVD, or BD (Blu-ray Disc).
(18) The image evaluation apparatus of the embodiment may typically be realized as an LSI (Large Scale Integration) integrated circuit. Each circuit may be an individual chip, or all or some of the circuits may be integrated into one chip. Although referred to here as an LSI, it may also be called an IC (Integrated Circuit), system LSI, super LSI, or ultra LSI depending on the degree of integration. The method of circuit integration is not limited to LSI; it may be realized with a dedicated circuit or a general-purpose processor. An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
(19) Program
A control program consisting of program code for causing processors of various devices such as computers, and various circuits connected to those processors, to execute the image evaluation processing described in the embodiment may be recorded on a recording medium, or circulated and distributed via various communication channels.
<Supplement 2>
The present embodiment includes the following aspects.
<References>
(1) Reference 1: Kazuhiro Hotta (Saitama University / JSPS Research Fellow), "Face Recognition Using Weighted Matching Based on the Information Content of Gabor Features", IEICE Technical Report, HIP, Human Information Processing 100(34), pp. 31-38, 2000-05-04.
(2) Reference 2: John C. Platt, Mary Czerwinski, Brent A. Field: "PhotoTOC: Automatic Clustering for Browsing Personal Photographs", Fourth IEEE Pacific Rim Conference on Multimedia (2003).
(3) Reference 3: Swain, M.J. and Ballard, D.H.: "Color Indexing", IJCV, 7, pp. 11-32 (1991).
2 storage medium
3 image evaluation apparatus
4 display
10 image acquisition unit
20 object extraction unit
21 object feature amount extraction unit
22 object clustering unit
30 event extraction unit
31 shooting date/time extraction unit
32 event clustering unit
40 background extraction unit
41 background feature amount extraction unit
50 image evaluation unit
51 object importance calculation unit
52 background importance calculation unit
60 album information selection unit
61 event cluster selection unit
62 template selection unit
70 image set evaluation unit
71 object introduction degree calculation unit
72 scene transition degree calculation unit
73 object action degree calculation unit
74 scene overview degree calculation unit
80 arrangement pattern evaluation unit
90 storage unit
91 template information storage unit
92 object cluster information storage unit
93 object importance storage unit
94 background feature amount storage unit
95 background importance storage unit
96 event cluster information storage unit
97 image set information storage unit
98 arrangement pattern information storage unit
100 display control unit
Claims (21)
- An image evaluation apparatus comprising: template storage means for storing a template having N frames (N is a natural number of 2 or more) for arranging images, one or more frame sets each configured by combining a plurality of frames from among the N frames, and an evaluation item relating to the feature amounts of the images arranged in each frame set; acquisition means for acquiring a plurality of images; evaluation means for generating a plurality of arrangement patterns in which N images selected from the acquired plurality of images are arranged in the N frames, and calculating an evaluation value for each generated arrangement pattern based on the evaluation item; and evaluation value storage means for storing the evaluation value calculated for each arrangement pattern by the evaluation means, wherein the evaluation means specifies the N images arranged according to the arrangement pattern corresponding to the highest evaluation value among the evaluation values stored in the evaluation value storage means.
- The image evaluation apparatus according to claim 1, wherein the evaluation item relating to the feature amounts of the images arranged in each frame set includes the object introduction quality of objects contained in the images arranged in the frame set, or the scene transition of the images arranged in the frame set.
- The image evaluation apparatus according to claim 2, further comprising object clustering means for clustering objects contained in the plurality of images, wherein the evaluation means evaluates the object introduction quality higher as the object clusters of the objects contained in the respective images arranged in the frame set differ more from one another.
- The image evaluation apparatus according to claim 3, further comprising object importance calculation means for calculating the importance of each clustered object cluster, wherein the evaluation means evaluates the object introduction quality higher as the importance of the object cluster of an object contained in an image arranged in the frame set is higher.
- The image evaluation apparatus according to claim 2, further comprising extraction means for extracting a background feature amount from each of the plurality of images, wherein the evaluation means evaluates the scene transition of the images arranged in the frame set based on the dissimilarity of the background feature amounts of the images or the length of the shooting interval between the images.
- The image evaluation apparatus according to claim 5, further comprising background importance calculation means for calculating the background importance of each of the plurality of images, wherein the evaluation means evaluates the scene transition higher as the background importance of each image arranged in the frame set is higher.
- The image evaluation apparatus according to claim 2, further comprising object clustering means for clustering objects contained in the plurality of images, wherein the evaluation item relating to the feature amounts of the images arranged in each frame set includes a degree of change relating to the same object cluster between images with similar backgrounds.
- The image evaluation apparatus according to claim 7, further comprising extraction means for extracting a background feature amount from each of the plurality of images, wherein the evaluation means evaluates the degree of change higher as, among the images arranged in the frame set, the object clusters contained in the respective images are the same and the background feature amounts of the images are more similar.
- The image evaluation apparatus according to claim 8, further comprising object importance calculation means for calculating the importance of each clustered object cluster, and background importance calculation means for calculating the background importance of each of the plurality of images, wherein the evaluation means evaluates the degree of change higher as at least one of the importance of the object cluster of an object contained in an image arranged in the frame set and the background importance of an image arranged in the frame set is higher.
- The image evaluation apparatus according to claim 7, further comprising calculation means for calculating, from each of the plurality of images, the area occupied by an object, wherein the evaluation means evaluates the degree of change higher as the occupied areas of objects belonging to the same object cluster differ more between the images arranged in the frame set.
- The image evaluation apparatus according to claim 10, further comprising object importance calculation means for calculating the importance of each clustered object cluster, and background importance calculation means for calculating the background importance of each of the plurality of images, wherein the evaluation means evaluates the degree of change higher as at least one of the importance of the object cluster of an object contained in an image arranged in the frame set and the background importance of an image arranged in the frame set is higher.
- The image evaluation apparatus according to claim 1, further comprising background importance calculation means for calculating the background importance of each of the plurality of images, wherein the evaluation means calculates the evaluation value based on the background importance of each image arranged in the frame set.
- The image evaluation apparatus according to claim 12, further comprising extraction means for extracting a background feature amount from each of the plurality of images, wherein the background importance calculation means calculates the background importance based on the similarity between the background feature amount of a target image and the background feature amounts of other images shot at close dates and times, and on the similarity between the background feature amount of the target image and the background feature amounts of images in which an important person appears.
- The image evaluation apparatus according to claim 1, further comprising display control means for displaying information corresponding to the evaluation item of the frame set on which the evaluation value calculation was based, in association with the images arranged in that frame set.
- The image evaluation apparatus according to claim 1, wherein, in the template, the frames constituting the one or more frame sets cover all of the N frames.
- The image evaluation apparatus according to claim 1, wherein the evaluation means narrows down the acquired plurality of images and makes the narrowed-down images the target of the selection.
- The image evaluation apparatus according to claim 16, further comprising background importance calculation means for calculating the background importance of each of the plurality of images, wherein the evaluation means performs the narrowing down based on the background importance of each of the plurality of images.
- The image evaluation apparatus according to claim 16, wherein the evaluation means divides the plurality of images into a plurality of scenes and performs the narrowing down based on the importance of each divided scene.
- An image evaluation method comprising: an acquisition step of acquiring a plurality of images; a reference step of referring to a template stored in template storage means, the template having N frames (N is a natural number of 2 or more) for arranging images, one or more frame sets each configured by combining a plurality of frames from among the N frames, and an evaluation item relating to the feature amounts of the images arranged in each frame set; an evaluation step of generating a plurality of arrangement patterns in which N images selected from the acquired plurality of images are arranged in the N frames based on the referenced template, and calculating an evaluation value for each generated arrangement pattern based on the evaluation item; an evaluation value storage step of storing the evaluation value calculated for each arrangement pattern in the evaluation step; and a specifying step of specifying the N images arranged according to the arrangement pattern corresponding to the highest evaluation value among the evaluation values stored in the evaluation value storage step.
- A program for causing a computer to execute image evaluation processing, the program causing the computer to execute: an acquisition step of acquiring a plurality of images; a reference step of referring to a template stored in template storage means, the template having N frames (N is a natural number of 2 or more) for arranging images, one or more frame sets each configured by combining a plurality of frames from among the N frames, and an evaluation item relating to the feature amounts of the images arranged in each frame set; an evaluation step of generating a plurality of arrangement patterns in which N images selected from the acquired plurality of images are arranged in the N frames based on the referenced template, and calculating an evaluation value for each generated arrangement pattern based on the evaluation item; an evaluation value storage step of storing the evaluation value calculated for each arrangement pattern in the evaluation step; and a specifying step of specifying the N images arranged according to the arrangement pattern corresponding to the highest evaluation value among the evaluation values stored in the evaluation value storage step.
- An integrated circuit comprising: template storage means for storing a template having N frames (N is a natural number of 2 or more) for arranging images, one or more frame sets each configured by combining a plurality of frames from among the N frames, and an evaluation item relating to the feature amounts of the images arranged in each frame set; acquisition means for acquiring a plurality of images; evaluation means for generating a plurality of arrangement patterns in which N images selected from the acquired plurality of images are arranged in the N frames, and calculating an evaluation value for each generated arrangement pattern based on the evaluation item; and evaluation value storage means for storing the evaluation value calculated for each arrangement pattern by the evaluation means, wherein the evaluation means specifies the N images arranged according to the arrangement pattern corresponding to the highest evaluation value among the evaluation values stored in the evaluation value storage means.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012557818A JP5908849B2 (ja) | 2011-02-17 | 2012-02-06 | 画像評価装置、画像評価方法、プログラム、集積回路 |
US13/639,024 US8885944B2 (en) | 2011-02-17 | 2012-02-06 | Image evaluation device, image evaluation method, program, integrated circuit |
CN201280001132.5A CN102859985B (zh) | 2011-02-17 | 2012-02-06 | 图像评价装置、图像评价方法及集成电路 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011031983 | 2011-02-17 | ||
JP2011-031983 | 2011-02-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012111275A1 true WO2012111275A1 (ja) | 2012-08-23 |
Family
ID=46672222
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/000787 WO2012111275A1 (ja) | 2011-02-17 | 2012-02-06 | 画像評価装置、画像評価方法、プログラム、集積回路 |
Country Status (4)
Country | Link |
---|---|
US (1) | US8885944B2 (ja) |
JP (1) | JP5908849B2 (ja) |
CN (1) | CN102859985B (ja) |
WO (1) | WO2012111275A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015053542A (ja) * | 2013-09-05 | 2015-03-19 | キヤノン株式会社 | 画像処理装置、画像処理方法、及びプログラム |
KR20200072238A (ko) * | 2018-12-12 | 2020-06-22 | 인하대학교 산학협력단 | 동영상 내 인물 영역 추출 장치 |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20120118383A (ko) * | 2011-04-18 | 2012-10-26 | 삼성전자주식회사 | 이미지 보정 장치 및 이를 이용하는 이미지 처리 장치와 그 방법들 |
US9177201B2 (en) * | 2012-05-29 | 2015-11-03 | Panasonic Intellectual Property Management Co., Ltd. | Image evaluation device, image evaluation method, program, and integrated circuit |
JP6019815B2 (ja) * | 2012-06-28 | 2016-11-02 | 富士通株式会社 | データ処理システム、アプリケーション提示方法、及びプログラム |
US9537706B2 (en) | 2012-08-20 | 2017-01-03 | Plentyoffish Media Ulc | Apparatus, method and article to facilitate matching of clients in a networked environment |
JP2014139734A (ja) * | 2013-01-21 | 2014-07-31 | Sony Corp | 情報処理装置および方法、並びにプログラム |
US11568008B2 (en) | 2013-03-13 | 2023-01-31 | Plentyoffish Media Ulc | Apparatus, method and article to identify discrepancies between clients and in response prompt clients in a networked environment |
JP5802255B2 (ja) | 2013-03-13 | 2015-10-28 | 富士フイルム株式会社 | レイアウト編集装置、レイアウト編集方法およびプログラム |
US10282075B2 (en) | 2013-06-24 | 2019-05-07 | Microsoft Technology Licensing, Llc | Automatic presentation of slide design suggestions |
US9672289B1 (en) * | 2013-07-23 | 2017-06-06 | Plentyoffish Media Ulc | Apparatus, method and article to facilitate matching of clients in a networked environment |
US9870465B1 (en) | 2013-12-04 | 2018-01-16 | Plentyoffish Media Ulc | Apparatus, method and article to facilitate automatic detection and removal of fraudulent user information in a network environment |
JP6033821B2 (ja) * | 2014-09-12 | 2016-11-30 | 富士フイルム株式会社 | 画像処理装置、画像処理方法、プログラムおよび記録媒体 |
AU2015203570A1 (en) * | 2015-06-26 | 2017-01-19 | Canon Kabushiki Kaisha | Method, system and apparatus for segmenting an image set to generate a plurality of event clusters |
US10534748B2 (en) | 2015-11-13 | 2020-01-14 | Microsoft Technology Licensing, Llc | Content file suggestions |
US10528547B2 (en) | 2015-11-13 | 2020-01-07 | Microsoft Technology Licensing, Llc | Transferring files |
US9824291B2 (en) | 2015-11-13 | 2017-11-21 | Microsoft Technology Licensing, Llc | Image analysis based color suggestions |
JP6595956B2 (ja) * | 2016-07-04 | 2019-10-23 | 富士フイルム株式会社 | 画像処理装置、画像処理方法、プログラムおよび記録媒体 |
JP6750094B2 (ja) * | 2017-03-15 | 2020-09-02 | 富士フイルム株式会社 | 画像評価装置,画像評価方法および画像評価プログラム |
JP6885896B2 (ja) * | 2017-04-10 | 2021-06-16 | 富士フイルム株式会社 | 自動レイアウト装置および自動レイアウト方法並びに自動レイアウトプログラム |
CN117112640B (zh) * | 2023-10-23 | 2024-02-27 | 腾讯科技(深圳)有限公司 | 一种内容排序方法以及相关设备 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007143093A (ja) * | 2005-10-18 | 2007-06-07 | Fujifilm Corp | アルバム作成装置、アルバム作成方法、およびアルバム作成プログラム |
JP2008052326A (ja) * | 2006-08-22 | 2008-03-06 | Fujifilm Corp | 電子アルバム生成装置、電子アルバム生成方法、および、そのプログラム |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4315344B2 (ja) | 2003-11-27 | 2009-08-19 | 富士フイルム株式会社 | 画像編集装置および方法並びにプログラム |
JP5284994B2 (ja) * | 2010-02-09 | 2013-09-11 | 株式会社沖データ | 画像処理装置 |
-
2012
- 2012-02-06 WO PCT/JP2012/000787 patent/WO2012111275A1/ja active Application Filing
- 2012-02-06 CN CN201280001132.5A patent/CN102859985B/zh active Active
- 2012-02-06 JP JP2012557818A patent/JP5908849B2/ja active Active
- 2012-02-06 US US13/639,024 patent/US8885944B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007143093A (ja) * | 2005-10-18 | 2007-06-07 | Fujifilm Corp | アルバム作成装置、アルバム作成方法、およびアルバム作成プログラム |
JP2008052326A (ja) * | 2006-08-22 | 2008-03-06 | Fujifilm Corp | 電子アルバム生成装置、電子アルバム生成方法、および、そのプログラム |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015053542A (ja) * | 2013-09-05 | 2015-03-19 | キヤノン株式会社 | 画像処理装置、画像処理方法、及びプログラム |
KR20200072238A (ko) * | 2018-12-12 | 2020-06-22 | 인하대학교 산학협력단 | 동영상 내 인물 영역 추출 장치 |
KR102179591B1 (ko) | 2018-12-12 | 2020-11-17 | 인하대학교 산학협력단 | 동영상 내 인물 영역 추출 장치 |
Also Published As
Publication number | Publication date |
---|---|
US8885944B2 (en) | 2014-11-11 |
CN102859985A (zh) | 2013-01-02 |
JPWO2012111275A1 (ja) | 2014-07-03 |
US20130028521A1 (en) | 2013-01-31 |
JP5908849B2 (ja) | 2016-04-26 |
CN102859985B (zh) | 2015-08-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5908849B2 (ja) | 画像評価装置、画像評価方法、プログラム、集積回路 | |
US8929669B2 (en) | Image evaluation apparatus that calculates an importance degree of each of a plurality of images | |
US8917943B2 (en) | Determining image-based product from digital image collection | |
CN103201769B (zh) | 图像处理装置、图像处理方法、集成电路 | |
US9934450B2 (en) | System and method for creating a collection of images | |
CN101755303B (zh) | 采用语义分类器的自动题材创建 | |
JP6323465B2 (ja) | アルバム作成プログラム、アルバム作成方法およびアルバム作成装置 | |
Obrador et al. | Supporting personal photo storytelling for social albums | |
US20120082378A1 (en) | method and apparatus for selecting a representative image | |
US20120294514A1 (en) | Techniques to enable automated workflows for the creation of user-customized photobooks | |
US20130050747A1 (en) | Automated photo-product specification method | |
US9336442B2 (en) | Selecting images using relationship weights | |
CN104915634A (zh) | 基于人脸识别技术的图像生成方法和装置 | |
US8831360B2 (en) | Making image-based product from digital image collection | |
CN106454064A (zh) | 图像处理装置以及图像处理方法 | |
CN106445424A (zh) | 信息处理方法及信息处理装置 | |
US10074039B2 (en) | Image processing apparatus, method of controlling the same, and non-transitory computer-readable storage medium that extract person groups to which a person belongs based on a correlation | |
Vonikakis et al. | A probabilistic approach to people-centric photo selection and sequencing | |
JP2006079457A (ja) | 電子アルバム表示システム、電子アルバム表示方法、電子アルバム表示プログラム、画像分類装置、画像分類方法、及び画像分類プログラム | |
US9117275B2 (en) | Content processing device, integrated circuit, method, and program | |
US20110044530A1 (en) | Image classification using range information | |
CN112035685B (zh) | 相册视频生成方法、电子设备和存储介质 | |
US20110304644A1 (en) | Electronic apparatus and image display method | |
KR101545386B1 (ko) | 포토앨범 제작 방법 및 시스템 | |
CN105677696A (zh) | 检索设备和检索方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201280001132.5 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012557818 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13639024 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12746842 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12746842 Country of ref document: EP Kind code of ref document: A1 |