WO2012153744A1 - Information processing device, information processing method, and information processing program

Information processing device, information processing method, and information processing program

Info

Publication number
WO2012153744A1
Authority
WO
WIPO (PCT)
Prior art keywords
still image
still
image pair
relationship
relevance
Prior art date
Application number
PCT/JP2012/061788
Other languages
English (en)
Japanese (ja)
Inventor
真澄 石川
Original Assignee
日本電気株式会社
Priority date
Filing date
Publication date
Application filed by 日本電気株式会社
Publication of WO2012153744A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/80 2D [Two Dimensional] animation, e.g. using sprites
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00129 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a display device, e.g. CRT or LCD monitor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/0035 User-machine interface; Control console
    • H04N1/00405 Output means
    • H04N1/00408 Display of information to the user, e.g. menus
    • H04N1/0044 Display of information to the user, e.g. menus for image preview or review, e.g. to help the user position a sheet
    • H04N1/00458 Sequential viewing of a plurality of images, e.g. browsing or scrolling
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/21 Intermediate information storage
    • H04N1/2104 Intermediate information storage for one or a few pictures
    • H04N1/2112 Intermediate information storage for one or a few pictures using still video cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/79 Processing of colour television signals in connection with recording
    • H04N9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/82 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N9/8205 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
    • H04N9/8227 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal the additional signal being at least another television signal
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2101/00 Still video cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/0077 Types of the still picture apparatus
    • H04N2201/0084 Digital still camera

Definitions

  • the present invention relates to an information processing device, an information processing method, and an information processing program, and more particularly to an information processing device, an information processing method, and an information processing program for determining a still image presentation method.
  • Patent Document 1 describes a slide show generation technique that realizes a presentation time according to the content of each still image. Specifically, the presentation time of a still image is controlled according to the number of faces included in the still image.
  • the technique of Patent Document 2 controls the presentation time according to the still image shooting time when generating a slide show.
  • the presentation time of continuous still images in a slide show is determined based on the difference in shooting time of still images.
  • the technique of Patent Document 3 extracts the degree of association between images and reproduces a slide show while giving a visual effect based on the degree of association (paragraph 0024).
  • This technique acquires two images to be compared and extracts the degree of association for each image (paragraph 0063).
  • This technique applies a visual effect associated with low relevance when the relevance between the images is weak, and a visual effect associated with high relevance when it is strong (paragraph 0074).
  • The techniques of Patent Documents 1, 2, and 3 determine a still image presentation method using information from at least one still image. These techniques can therefore represent a change in the contents of two consecutive still images by a change in the presentation method. However, they do not change the presentation method when the relationship between two consecutive still images does not change.
  • An object of the present invention is to provide an information processing apparatus that solves the above-mentioned problem.
  • According to one aspect, an information processing apparatus includes: comparison means for comparing, in a still image group including at least three still images, a first relevance between the still images of a first still image pair included in the still image group with a second relevance between the still images of a second still image pair; presentation method determination means for determining, based on the change from the first relevance to the second relevance, second information specifying the presentation method of the second still image pair from first information specifying the presentation method of the first still image pair; and generation means for generating a slide show including the first still image pair and the second still image pair.
  • According to another aspect, an information processing method includes: comparing, in a still image group including at least three still images, a first relevance between the still images of a first still image pair included in the still image group with a second relevance between the still images of a second still image pair; and determining, based on the change from the first relevance to the second relevance, second information specifying the presentation method of the second still image pair from first information specifying the presentation method of the first still image pair.
  • According to another aspect, an information processing program causes a computer to operate as: comparison means for comparing, in a still image group including at least three still images, a first relevance between the still images of a first still image pair included in the still image group with a second relevance between the still images of a second still image pair; presentation method determination means for determining, based on the change from the first relevance to the second relevance, second information specifying the presentation method of the second still image pair from first information specifying the presentation method of the first still image pair; and generation means for generating, based on the determined presentation method, a slide show including the first still image pair and the second still image pair.
  • the information processing apparatus 100 is an apparatus that generates a slide show including at least three still images.
  • the information processing apparatus 100 includes an association comparison unit (comparison unit) 101, a presentation method determination unit 102, and a slide show generation unit (generation unit) 103.
  • The relevance comparison unit 101 compares, in a still image group including at least three still images, a first relevance between the still images of the first still image pair included in the still image group with a second relevance between the still images of the second still image pair.
  • The presentation method determination unit 102 determines, based on the change from the first relevance to the second relevance, second information that specifies the presentation method of the second still image pair from first information that specifies the presentation method of the first still image pair.
  • the slide show generation unit 103 generates a slide show including the first still image pair and the second still image pair based on the presentation method determined by the presentation method determination unit 102. According to the above configuration, it is possible to generate a slide show that presents still images according to a change in relevance between still images.
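  • As an illustration only (not taken from this publication), the flow of units 101 to 103 can be sketched roughly as follows; the relevance function, the parameter a, and the initial time Ts are assumed placeholders.
```python
# Minimal sketch of the first embodiment: relevance comparison unit 101,
# presentation method determination unit 102, slide show generation unit 103.
# `relevance_of`, `Ts`, and `a` are illustrative assumptions, not the patent's terms.

def generate_slideshow(images, relevance_of, Ts=3.0, a=0.8):
    """Return (image, presentation_time) pairs for a slide show whose presentation
    keeps changing while the relevance between consecutive pairs stays the same."""
    assert len(images) >= 3, "the still image group contains at least three images"

    # Unit 101: relevance of each consecutive still image pair.
    pair_relevance = [relevance_of(images[i], images[i + 1])
                      for i in range(len(images) - 1)]

    # Unit 102: derive each presentation time from the previous one and the
    # change (or absence of change) in relevance between adjacent pairs.
    times = [Ts]
    for i in range(1, len(images)):
        unchanged = i >= 2 and pair_relevance[i - 1] == pair_relevance[i - 2]
        # While the relevance between adjacent pairs does not change, keep varying
        # the presentation time; when it changes (or for the first pair), reset to Ts.
        times.append(times[-1] * a if unchanged else Ts)

    # Unit 103: combine the still images with the determined presentation times.
    return list(zip(images, times))
```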
  • The information processing apparatus 200 includes an image input unit 210 that inputs image information from an imaging device 250 such as a digital camera or a digital video camera, a relevance comparison unit (comparison unit) 201 that compares the relevance of input images, and a relevance determination unit 204 that determines relevance.
  • The information processing apparatus 200 also includes a presentation method determination unit 202 that determines the display length of each image included in the slide show, the effect at the time of switching images, the BGM (Background Music) played during display, the jingle at the time of switching, and the like. Further, the information processing apparatus 200 includes a slide show generation unit (generation unit) 203 that generates a slide show by combining the input still images according to the presentation method determined by the presentation method determination unit 202. The information processing apparatus 200 is also connected to a display 260 for displaying the generated slide show of still images.
  • the image information input by the image input unit 210 includes an image ID (identifier) for identifying a still image, a presentation order in a slide show, and pixel information of the still image.
  • the image information may include meta information describing a subject, a shooting location, and a shooting time shown in a still image, and sensor information such as GPS (Global Positioning System).
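  • A possible in-memory representation of the image information described above (a sketch; the field names are assumptions chosen to mirror the description):
```python
from dataclasses import dataclass
from typing import Any, Optional, Tuple

@dataclass
class ImageInfo:
    image_id: str                              # identifier of the still image
    presentation_order: int                    # order of presentation in the slide show
    pixels: Any                                # pixel information (decoded image data)
    subject: Optional[str] = None              # meta information: subject shown
    location: Optional[str] = None             # meta information: shooting location
    shot_at: Optional[str] = None              # meta information: shooting time
    gps: Optional[Tuple[float, float]] = None  # sensor information such as GPS
```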
  • the relevancy determination unit 204 determines the first relevance based on the commonality of objects represented in two still images included in the first still image pair.
  • the relevance determination unit 204 determines the second relevance based on the commonality of objects represented in two still images included in the second still image pair.
  • the first still image pair and the second still image pair are pairs of continuous still images, respectively.
  • the relevancy determination unit 204 inputs the image ID of the still image and the relevance flag to the relevance comparison unit 201 as the image relevance information.
  • the relevancy determination unit 204 may input pixel information in addition to the above as image relevance information.
  • the relevance flag is data representing a relevance type existing between a current still image and a still image presented thereafter among relevance types defined in advance. Alternatively, the relevance flag is data indicating that no relevance type exists between these images (no relevance).
  • The relevance flag is set to 1 for every relevance type that exists between a certain still image and the subsequent still image.
  • The relevance flag is set to 0 for every relevance type that does not exist.
  • the relevance flag may be set with any numerical value that has meaning depending on the relevance type.
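  • The image relevance information can therefore be pictured as an image ID plus one flag per predefined relevance type, for example as below (a sketch; the type names follow the later sections, and flags may also take other meaningful values such as -1 or 2):
```python
RELEVANCE_TYPES = ("identity", "size_relationship", "partial_relationship", "homogeneity")

def make_image_relevance_info(image_id, present_types):
    """Flag 1 for every relevance type that exists between this still image and the
    following one, flag 0 for every type that does not exist."""
    flags = {t: 1 if t in present_types else 0 for t in RELEVANCE_TYPES}
    return {"image_id": image_id, "relevance_flags": flags}

# Example: identity and homogeneity hold between this image and the next one.
info = make_image_relevance_info("IMG_0501", {"identity", "homogeneity"})
```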
  • the relationship comparison unit 201 compares the first relationship between the still images of the first still image pair included in the still image group and the second relationship between the still images of the second still image pair. Specifically, the relevance comparison unit 201 determines the relevance between pairs of still images that are consecutive in the slide show.
  • the still image group includes continuous first, second, and third still images.
  • the first still image pair is a pair of a first still image and a second still image.
  • the second still image pair is a pair of the second still image and the third still image.
  • the presentation method determination unit 202 presents each still image based on the image relevance information input from the relevance comparison unit 201, the image information input from the image input unit 210, and a pre-registered presentation rule. Decide how. Then, the presentation method determination unit 202 inputs the presentation method information to the slide show generation unit 203.
  • the slide show generation unit 203 generates a slide show by combining still images based on the determined presentation method.
  • the relevancy determination unit 204 determines relevance based on the commonality (identity) of objects represented in two still images included in the still image pair.
  • Relevance 1 is the relevance determined in this way.
  • the identity can be determined by the commonality of the feature quantities derived from a plurality of still images included in the still image group.
  • the relevance 1 is a relevance indicating that the objects shown in the still image pairs continuous in the slide show are the same.
  • the relevance determination unit 204 sets 1 in the relevance flag for relevance 1 if it is the same, and 0 if it is not the same.
  • the relevancy determination unit 204 can determine the identity of the target based on the similarity of the target area images detected from the still images.
  • the target region is a region on a still image of a target having a certain image-like pattern such as a stationary object such as a tree or a building, or a moving body such as a human or an animal.
  • the target area may be a partial area in the still image.
  • the target area may be the entire still image.
  • the relevancy determination unit 204 detects a target region from a pair of still images that are continuous in the slide show.
  • the relevancy determination unit 204 determines the identity of the target based on the similarity of the target area.
  • the relevancy determination unit 204 may determine the identity based on the similarity between the target areas detected from all the still images included in the slide show.
  • the relevance determination unit 204 groups all detected target areas based on similarity.
  • the relevance determination unit 204 determines that the target areas are the same when the target areas detected from adjacent still image pairs belong to the same group.
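  • The grouping-based identity check just described can be sketched as follows (the greedy clustering and the similarity threshold are assumptions):
```python
def group_target_areas(features, similarity, threshold=0.8):
    """Cluster the target areas detected from all still images by feature similarity.
    features: one feature vector per detected target area. Returns a group id per area."""
    representatives = []          # one representative feature per group
    group_ids = []
    for f in features:
        for gid, rep in enumerate(representatives):
            if similarity(f, rep) >= threshold:
                group_ids.append(gid)
                break
        else:
            representatives.append(f)
            group_ids.append(len(representatives) - 1)
    return group_ids

def same_object(group_ids, i, j):
    """Target areas from adjacent still images i and j are judged the same object
    when they belong to the same group."""
    return group_ids[i] == group_ids[j]
```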
  • the detection method of the target area is divided into a detection method for detecting a specific target registered in advance and a detection method for detecting a general target that is not registered.
  • the image data of each registered target may be used as a template.
  • the relevancy determination unit 204 may scan the input image with templates converted into various resolutions.
  • the relevancy determination unit 204 may detect a region having a small difference in pixel values at the same position as the template as a corresponding target region.
  • the relevancy determination unit 204 may extract an image feature amount expressing color, texture, and shape from each partial region of the input image.
  • the relevancy determination unit 204 may set a partial region having an image feature amount similar to the registered image feature amount of each target as a corresponding target region.
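  • A minimal sketch of the template-based detection of a pre-registered specific target, scanning the input image with the template at several resolutions (OpenCV is assumed; the scales and the score threshold are assumptions):
```python
import cv2

def detect_registered_target(image_gray, template_gray,
                             scales=(0.5, 0.75, 1.0, 1.5), threshold=0.8):
    """Return the best-matching region (x, y, w, h) of the registered target, or None."""
    best = None
    for s in scales:
        tmpl = cv2.resize(template_gray, None, fx=s, fy=s)
        if tmpl.shape[0] > image_gray.shape[0] or tmpl.shape[1] > image_gray.shape[1]:
            continue
        # High normalized-correlation score = small pixel-value difference to the template.
        scores = cv2.matchTemplate(image_gray, tmpl, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(scores)
        if max_val >= threshold and (best is None or max_val > best[0]):
            h, w = tmpl.shape[:2]
            best = (max_val, (max_loc[0], max_loc[1], w, h))
    return best[1] if best else None
```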
  • When the specific target is a person, there is a method of using information obtained from the entire face.
  • For example, there is a method of storing images showing various faces as templates and determining that a face exists in the input image when the difference between the input image and a template is equal to or smaller than a threshold value. It is also conceivable to store in advance a model combining color information such as skin color with edge directions and densities, and to determine that a face exists when a region similar to the model is detected in an input frame.
  • Other methods include face detection using the characteristic luminance distribution in which the cheeks and forehead are bright while the eyes and mouth are dark, and face detection using facial symmetry and the area and position of skin color.
  • There are also methods that statistically learn the distribution of feature values obtained from a large number of face and non-face learning samples and determine whether a feature value obtained from the input image belongs to the face or non-face distribution; examples include methods using a neural network, a support vector machine, or the AdaBoost method.
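  • As one off-the-shelf counterpart to the statistically learned detectors mentioned above, OpenCV's Haar-cascade face detector (trained on face and non-face samples) can be used; this is an illustration, not the method of this publication:
```python
import cv2

def detect_faces(image_bgr):
    """Return a list of (x, y, w, h) face regions detected in a BGR image."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
```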
  • detecting a general target for example, Normalized Cut, Saliency Map, Depth of Field (DoF), or the like may be used.
  • Normalized Cut is a method of dividing an image into a plurality of regions. For details, see Jianbo Shi and Jitendra Malik, "Normalized Cuts and Image Segmentation", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 8, August 2000.
  • the relevancy determination unit 204 may detect a region located in the center of the screen among regions divided by Normalized Cut as a target region.
  • Saliency Map is a method of calculating an object region in an image from visual attention. For the Saliency Map, see L. Itti, C. Koch and E. Niebur, "A Model of Saliency-based Visual Attention for Rapid Scene Analysis," IEEE Trans. Pattern Analysis and Machine Intelligence, Vol. 20, No. 11, pp. 1254-1259, 1998.
  • The relevancy determination unit 204 may detect, as a target region, a region for which a high importance level has been calculated by the Saliency Map. DoF is a method based on the characteristic that edges of a target within the depth of field are not blurred while edges outside the depth of field are blurred. For details, see Du-Ming Tsai, Hu-Jong Wang, "Segmenting focused objects in complex visual images", Pattern Recognition Letters, Vol. 19, pp. 929-940, 1998. The relevancy determination unit 204 may calculate the amount of blur based on the thickness of an edge, combine edges with little blur, and detect the focused area as a target area.
  • The relevancy determination unit 204 may detect one target area per still image based on its position in the still image, based on an evaluation value indicating visibility (the quality of the appearance, determined by lighting conditions, orientation, angle, position on the screen, occlusion by another object, blur, facial expression in the case of a person, and so on), or based on the appearance frequency across a plurality of images.
  • the relevancy determination unit 204 may combine a plurality of detected target areas into one target area.
  • the relevancy determination unit 204 may extract the image feature amount, and calculate the similarity between the target regions based on a scale that calculates a higher value as the difference in the image feature amount is smaller.
  • the relevancy determination unit 204 can calculate an image feature amount based on image information such as a color, an edge, and a texture detected from the target region. Alternatively, the relevancy determination unit 204 may detect local feature points such as SIFT (Scale-Invariant Feature Transform) from each target image region. Then, the relevance determination unit 204 may associate feature points between image regions. The relevancy determination unit 204 may use a scale that calculates a higher value as the number of associated feature points is larger or the positional relationship between the associated feature points is similar between images.
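  • A sketch of the SIFT-based similarity between two target regions, scoring by the number of reliably associated feature points (OpenCV is assumed; the ratio-test value is an assumption):
```python
import cv2

def sift_similarity(region_a_gray, region_b_gray, ratio=0.75):
    """Higher value = more associated local feature points between the two regions."""
    sift = cv2.SIFT_create()
    kp_a, des_a = sift.detectAndCompute(region_a_gray, None)
    kp_b, des_b = sift.detectAndCompute(region_b_gray, None)
    if des_a is None or des_b is None:
        return 0.0
    matches = cv2.BFMatcher().knnMatch(des_a, des_b, k=2)
    # Lowe's ratio test keeps only unambiguous correspondences.
    good = [m for m in matches
            if len(m) == 2 and m[0].distance < ratio * m[1].distance]
    return len(good) / max(1, min(len(kp_a), len(kp_b)))
```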
  • The presentation method determination unit 202 determines the second presentation method information from the first presentation method information so that the presentation method changes in the second still image pair in the same manner as the presentation method changed in the first still image pair.
  • For example, when there is no change in relevance, the presentation method determination unit 202 determines the presentation method information so that the presentation time changes in the second still image pair in the same manner as the presentation time changed in the first still image pair.
  • the presentation method information is data indicating the presentation method of each still image.
  • the presentation method information includes an image ID and a presentation time.
  • the presentation method information may include an effect, BGM, audio jingle, and video jingle.
  • the presentation rule is a rule that defines a method of presenting a still image according to the relevance type. It is assumed that the presentation method determination unit 202 holds a parameter that defines each presentation time of continuous still image pairs as a presentation rule. In addition to the presentation time, the presentation method determination unit 202 may hold control parameters related to effects, BGM, and jingles (short video, music, and sound effects) inserted between still images. In addition, the presentation rule may define a presentation method in a case where no relevance type exists in a continuous still image pair. In the present embodiment, the presentation method determination unit 202 determines the presentation time of a still image pair based on the identity of objects included in successive still image pairs.
  • the presentation method determination unit 202 sets the presentation time of the still image to be presented first to the initial value Ts.
  • the presentation method determination unit 202 determines the presentation time of the subsequent still image based on Ts. If the objects included in the continuous still image pairs are not the same, the presentation method determination unit 202 determines the subsequent presentation time independently of the previous still image presentation time.
  • the presentation method determination unit 202 may set the subsequent presentation time to, for example, the initial value Ts.
  • the presentation method determination unit 202 may set the subsequent presentation time to a random value within a specified range. Note that the presentation method determination unit 202 may set Tp as the presentation time of a still image with high visibility among the group of still images obtained by photographing the same object.
  • the presentation method determination part 202 may determine the presentation time of a subsequent still image on the basis of Tp. In addition, the presentation method determination unit 202 may set the presentation time of the next still image in which the presentation time of the still image is equal to or less than Tq among the still image groups obtained by photographing the same target as the initial value Ts. And the presentation method determination part 202 may determine the presentation time of a subsequent still image on the basis of Ts. In addition, the presentation method determination unit 202 may set the presentation time of the still image presented last among the still image groups obtained by photographing the same target as the initial value Ts. The presentation method determination unit 202 may calculate the values of Ts and Tp according to the number of images to be presented in consideration of the preset presentation time of the entire slide show.
  • the presentation method determination unit 202 calculates the presentation time of the subsequent still image by multiplying the parameter a by the presentation time of a certain still image.
  • the presentation time of the subsequent still images 302 to 305 is expressed by the following equation (1).
  • When the visibility evaluation value of the still image 303, which faces the front, is equal to or greater than the threshold and the presentation time of the still image 303 is set to Tp, the presentation time of the subsequent still images is expressed by the following equation (2).
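  • Equations (1) and (2) themselves are not reproduced in this text. From the multiplication rule just described, they presumably take the following form, where Ts is the initial presentation time of still image 301, Tp the presentation time given to the highly visible still image 303, and a the multiplication parameter (a reconstruction, not the original equations):
```latex
% Presumed form of equation (1): each presentation time is the previous one times a.
T_{302} = a\,T_s,\qquad T_{303} = a^{2} T_s,\qquad T_{304} = a^{3} T_s,\qquad T_{305} = a^{4} T_s
% Presumed form of equation (2): still image 303 is reset to T_p, and the later
% presentation times follow from T_p by the same multiplication rule.
T_{303} = T_p,\qquad T_{304} = a\,T_p,\qquad T_{305} = a^{2} T_p
```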
  • the information processing apparatus 200 can generate a video in which the presentation time of consecutive images changes even if the consecutive images include the same target.
  • The presentation method determination unit 202 determines an effect, BGM, and a jingle to be inserted between still image pairs based on the identity of the objects included in successive still image pairs. For example, when the targets included in the pair of still images are the same, the presentation method determination unit 202 inserts a special effect registered in advance as an effect with little visual change when switching still images (such as a dissolve or fade). If they are not the same, the presentation method determination unit 202 inserts special effects registered in advance as effects having a large visual change when switching still images (DVE (Digital Video Effects) such as page turning and wipe).
  • the presentation method determination unit 202 gradually shortens (lengthens) the length of the effect jingle.
  • For example, the presentation method determination unit 202 gradually decreases (increases) the volume of the BGM. Further, for example, when the targets included in the continuous still image pairs are the same, the presentation method determination unit 202 plays the same BGM during the presentation of the still image pairs. When the targets included in the continuous still image pairs are not the same, the presentation method determination unit 202 stops the BGM or switches to a different BGM when switching the still images. In addition, the presentation method determination unit 202 may insert jingles between images that do not have identity. Thereby, a group of still images obtained by photographing the same object is connected smoothly without a noticeable change in image or sound, and the viewer can notice when the photographed object changes.
  • The presentation rules for controlling the presentation method are rules based on the magnitude relationship or the partial relationship for still image pairs whose target areas are the same, and rules based on homogeneity for still image pairs whose target areas are not the same.
  • the rules based on the magnitude relationship, the partial relationship, and the homogeneity will be described in detail in the third and subsequent embodiments.
  • the image input unit 210 inputs the image information of the still image 501 to the relevancy determination unit 204, and proceeds to step S403. If the input still image is a start image, the process returns from step S403 to step S401, and the image input unit 210 inputs the image information of the second still image 502 to the relevancy determination unit 204 (step S403).
  • In step S405, the relevance determination unit 204 detects a target area from the still images 501 and 502. Assume that buildings, flowers, and people are registered in advance as targets in the relevancy determination unit 204, and that a model has been learned for each. The relevancy determination unit 204 then detects the part surrounded by a solid-line rectangle as the target area of the building in each of the still images 501 and 502. The relevancy determination unit 204 extracts image feature amounts from the pixel information of the target region 0 and the target region 1, and determines identity, magnitude relationship, partial relationship, and homogeneity based on the similarity between the regions. Since the target areas 0 and 1 are both detected as the building type, it is determined that there is homogeneity.
  • A broken-line rectangular area on the still image 501 is detected as the common area of the target area 1 and the target area 0. It is determined that the target areas 1 and 0 have a magnitude relationship. Further, since there is no area other than the common area on the target area 0, it is determined that there is no partial relationship. Therefore, the relevance flags between the still image 501 and the still image 502 are 1, -1, 0, 1 in the order of identity, magnitude relationship, partial relationship, and homogeneity (step S407).
  • the presentation method determination unit 202 determines the presentation method based on the image ID and the relevance flag as the image relevance information. Since the target areas of the still image 501 and the still image 502 are the same, a rule based on a magnitude relationship or a partial relationship is applied.
  • the presentation method determination unit 202 sets the presentation time of the still image 501 that is the start image to the initial value Ts.
  • the presentation method determination unit 202 sets the presentation time of the still image 502 to a * Ts because the magnitude relationship between the still images 501 and 502 is a small / large relationship. Since the still images 501 and 502 have a magnitude relationship, the presentation method determination unit 202 inserts a dissolve with little visual change as an effect of switching between the still images 501 and 502 (step S409).
  • The slide show generation unit 203 generates a slide show using the still images 501 and 502 with the determined presentation time and effect (step S411). The above steps are performed for all still images (step S413).
  • In FIG. 6, the types of the target areas detected from the still images are shown as target areas 601, the relevance flags for each relevance type are shown as relevance flags 602, and the presentation time lengths and effects determined by the presentation method determination unit 202 are shown as presentation time lengths 603 and effects 604.
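  • The flow of steps S401 to S413 described above can be sketched as follows (the helper functions stand in for the processing of units 204, 202, and 203 and are assumptions):
```python
def run_flow(image_infos, detect_target_area, determine_relevance,
             decide_presentation, generate):
    """Walk the still image group in presentation order and build the slide show."""
    presentation = []
    previous = None
    for info in image_infos:                                   # S401/S403: input next image
        area = detect_target_area(info)                        # S405: detect the target area
        flags = (determine_relevance(previous, (info, area))   # S407: relevance flags with
                 if previous is not None else None)            #       the previous image
        presentation.append(decide_presentation(flags, presentation))  # S409: time / effect
        previous = (info, area)
    return generate(image_infos, presentation)                 # S411/S413: generate slide show
```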
  • the presentation method determination unit 202 controls the presentation method of continuous still images according to the relevance of each other.
  • Even when consecutive still images include the same number of face images, if there is no relevance in terms of content, it can be shown to the viewer that there is no relevance.
  • The presentation method may be changed according to any one of the following relevance changes, and any one of the following presentation rules may be adopted.
  • (Relevance 2. Target size relationship)
  • the relevancy determination unit 204 may determine the relevance based on the magnitude relationship between the objects represented in the two still images included in the still image pair.
  • The relevancy determination unit 204 may determine relevance based on the change in the size of a region including a specific object registered in advance between the two still images included in a still image pair.
  • the relevance 2 is the relevance determined in this way.
  • the “target size relationship” means that the targets included in the still image pairs that are continuous in the slide show are the same, and the area of the target region has a difference greater than a specified value. For example, there is a case where a target is introduced by generating a slide show by combining an image including the periphery of the target and an image obtained by photographing only the target.
  • the relevancy determination unit 204 can determine the magnitude relationship between objects based on the areas of partial areas common to the target areas determined to be the same or the distances between feature points included in the common partial areas.
  • the relevancy determination unit 204 can determine that the larger the distance between feature points is, the larger the object is photographed.
  • The relevancy determination unit 204 may make the determination between target areas determined to be the same in a pair of still images that are consecutive in the slide show. In this case, the relevancy determination unit 204 sets the relevance flag for relevance 2 to 1 when the area of the target area in the next still image is larger than the area of the target area in the current still image, to -1 when it is smaller, and indicates no magnitude relationship otherwise.
  • the relevancy determination unit 204 compares the areas of the partial areas common to the target areas determined to be the same among the target areas detected from all the still images included in the slide show or the distance between the feature points, The magnitude relationship may be determined.
  • Based on the maximum area Smax and the minimum area Smin of the partial areas common to the target areas determined to be the same, the relevancy determination unit 204 may classify a target area smaller than (Smax + 2Smin) / 3 as small.
  • The relevancy determination unit 204 may classify a target area larger than (Smax + 2Smin) / 3 and smaller than (2Smax + Smin) / 3 as medium.
  • The relevancy determination unit 204 may classify a target area larger than (2Smax + Smin) / 3 as large. In this case, the relevancy determination unit 204 sets the relevance flag to 1 if the target area in a still image and the target area in the next still image have a small-to-medium or medium-to-large relationship.
  • The relevancy determination unit 204 sets 2 if the target area in a certain still image and the target area in the next still image have a small-to-large relationship.
  • The relevance determination unit 204 sets -1 in the relevance flag if the target area in a certain still image and the target area in the next still image have a large-to-medium or medium-to-small relationship.
  • The relevancy determination unit 204 sets -2 if the target area in a certain still image and the target area in the next still image have a large-to-small relationship.
  • Otherwise, the relevancy determination unit 204 sets 0 in the relevance flag.
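  • The three-level size classification and the resulting relevance-2 flag values described above can be sketched as follows:
```python
def size_class(area, s_min, s_max):
    """Classify a common-area size as small / medium / large using the thresholds
    (Smax + 2*Smin) / 3 and (2*Smax + Smin) / 3."""
    low, high = (s_max + 2 * s_min) / 3, (2 * s_max + s_min) / 3
    return "small" if area < low else ("medium" if area < high else "large")

def size_relationship_flag(area_current, area_next, s_min, s_max):
    """Flag: +1 small->medium or medium->large, +2 small->large,
    -1 large->medium or medium->small, -2 large->small, 0 no change."""
    order = {"small": 0, "medium": 1, "large": 2}
    return (order[size_class(area_next, s_min, s_max)]
            - order[size_class(area_current, s_min, s_max)])
```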
  • When the change of the object from large to small continues, the presentation method determination unit 202 treats the relevance as unchanged and changes the presentation method in the same manner; for example, it gradually shortens the presentation time at the same time interval. Specifically, the presentation method determination unit 202 controls the presentation method based on the following rules.
  • [Rules according to the target size] (2-1) Rules regarding presentation time
  • The presentation method determination unit 202 determines the presentation time of a still image pair based on the size relationship of the objects included in successive still image pairs. For example, the presentation method determination unit 202 sets the presentation time of the first still image to be presented, among a still image group having a target size relationship, to the initial value Ts.
  • the presentation method determination unit 202 determines the presentation time of the subsequent still image based on Ts.
  • the presentation method determination unit 202 may set Tp as the presentation time of a still image with high visibility among a group of still images having a target size relationship.
  • the presentation method determination part 202 may determine the presentation time of a subsequent still image on the basis of Tp.
  • the presentation method determination unit 202 may set, as the initial value Ts, the presentation time of the next still image in which the presentation time of the still image is equal to or less than Tq among the still image groups having a magnitude relationship.
  • the presentation method determination part 202 may determine the presentation time of a subsequent still image on the basis of Ts.
  • the presentation method determination unit 202 may set the presentation time of the still image presented last among the still image groups having a magnitude relationship to Ts.
  • the presentation method determination unit 202 may calculate the values of Ts and Tp according to the number of images to be presented from the preset presentation time of the entire slide show.
  • the presentation method determination unit 202 determines the presentation time of the subsequent still image independently of the presentation time of the previous still image. For example, the presentation method determination unit 202 may set the presentation time of the subsequent still image to the initial value Ts.
  • the presentation method determination unit 202 may set the presentation time of the subsequent still image to a random value within a specified range.
  • The relevancy determination unit 204 determines the magnitude relationship between successive still images by comparing the areas of the target regions determined to be the same among the target regions detected from all the still images included in the slide show.
  • the presentation method determination unit 202 calculates the presentation time of the next still image by multiplying the presentation time of a certain still image by the relevance flag parameter a.
  • the presentation time of the first still image 701 is the initial value Ts
  • the still images 701 and 702 have a small and medium relationship
  • the still images 702 and 703 have a medium and large relationship
  • the presentation time of the still image 702 is a·Ts (multiplication by a).
  • the presentation time of the still image 703 is a·a·Ts (multiplication by a). Since the relevance flag of the still images 703 and 704 is -2, the presentation time of 704 is Ts (division by a·a).
  • The information processing apparatus 200 can generate a video in which the presentation time of consecutive images changes even if the consecutive images include the same target. For this reason, this embodiment has the effect that a slide show with a tempo that does not bore viewers can be generated.
  • (2-2) Rules regarding effects, BGM, and jingles
  • the presentation method determination unit 202 determines an effect, a BGM, and a jingle to be inserted between a pair of still images based on the size relationship of objects included in successive still image pairs.
  • For example, when the targets included in the continuous still image pair have a magnitude relationship, the presentation method determination unit 202 inserts special effects (such as dissolve or fade) registered in advance as effects with little visual change when switching still images.
  • When they do not have a magnitude relationship, the presentation method determination unit 202 inserts special effects (DVE such as page turning or wipe) registered in advance as effects having a large visual change when switching still images.
  • the presentation method determination unit 202 plays the same BGM during the presentation of the still image pairs.
  • the presentation method determination unit 202 stops the BGM or switches to a different BGM when switching the still images.
  • The presentation method determination unit 202 may insert jingles between images that do not have a magnitude relationship. Thereby, a group of still images having a magnitude relationship is connected smoothly without a noticeable change in image or sound.
  • When the targets included in a pair of still images do not have a magnitude relationship, the image and sound change greatly, so that the viewer notices that the content has changed and can concentrate on understanding the content of the slide show.
  • (Relevance 3. Target partial relationship)
  • the relevancy determination unit 204 may determine the relevance based on a partial relationship of objects represented in two still images included in the still image pair. That is, the relevancy determination unit 204 may determine whether the object represented in the two still images included in the still image pair is in a relationship between the whole and the part.
  • Relevance 3 is the relevance determined in this way. "Objects in a partial relationship" means that the objects shown in consecutive still image pairs in the slide show are the same, but different parts of the object are captured in the respective images. For example, when it is desired to capture a wide landscape, a large object, or a long object, the whole may be expressed by combining still images that each capture a part of the object into a slide show.
  • The relevancy determination unit 204 sets 1 in the relevance flag for relevance 3 when the objects are in a partial relationship.
  • The relevance determination unit 204 sets 0 in the relevance flag for relevance 3 when the objects are not in a partial relationship.
  • the relevancy determination unit 204 can determine the target partial relationship based on a partial area (common area) that is common to the target areas determined to be the same in consecutive still images in the slide show. For example, the relevancy determination unit 204 uses one of the target areas as a template. Then, the relevancy determination unit 204 scans the other target area, detects a position with a small difference, and sets the overlapping area as a common area.
  • The relevance determining unit 204 determines that the objects are in a partial relationship when, in each target region, the region other than the common region is equal to or larger than a specified area. Alternatively, the relevancy determination unit 204 may perform the determination based on the relative positions of the target areas determined to be the same across all the still images included in the slide show. When the change of the object from the whole to a part continues, the presentation method determination unit 202 treats the relevance as unchanged and, for example, gradually shortens the presentation time at the same time interval. Specifically, the presentation method determination unit 202 controls the presentation method based on the following rules.
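  • Before turning to those rules, the common-area check just described can be summarized as follows (a sketch; min_leftover is an assumed threshold expressed in pixels):
```python
def has_partial_relationship(area_a_size, area_b_size, common_size, min_leftover=2000):
    """Given the pixel sizes of two target areas judged to be the same object and the
    size of their common area (found by scanning one area over the other), the objects
    are in a partial relationship when both leftover regions are large enough."""
    leftover_a = area_a_size - common_size
    leftover_b = area_b_size - common_size
    return leftover_a >= min_leftover and leftover_b >= min_leftover
```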
  • the presentation method determination unit 202 determines the presentation time of a still image pair based on the target partial relationship included in the continuous still image pair. For example, the presentation method determination unit 202 sets the presentation time of the first still image to be presented as the initial value Ts among the still image groups in the target partial relationship. Then, the presentation method determination unit 202 determines the presentation time of the subsequent still image based on Ts. In addition, the presentation method determination unit 202 may set Tp as the presentation time of a still image with high visibility among the still image groups in the target partial relationship. And the presentation method determination part 202 may determine the presentation time of a subsequent still image on the basis of Tp.
  • the presentation method determination unit 202 may set, as an initial value Ts, the presentation time of the next still image in which the presentation time of the still image is equal to or less than Tq among the still image groups having a partial relationship. And the presentation method determination part 202 may determine the presentation time of a subsequent still image on the basis of Ts. In addition, the presentation method determination unit 202 may set the presentation time of the last presented image among the still image groups having a partial relationship to Ts. The presentation method determination unit 202 may calculate the values of Ts and Tp according to the number of images to be presented from the preset presentation time of the entire slide show.
  • The presentation method determination unit 202 determines the presentation time of the subsequent still image independently of the presentation time of the previous still image. For example, the presentation method determination unit 202 may set the presentation time of the subsequent still image to the initial value Ts. The presentation method determination unit 202 may set the presentation time of the subsequent still image to a random value within a specified range. A case where still images obtained by photographing a landscape are reproduced will be described with reference to FIG. 8.
  • The relevancy determination unit 204 determines the partial relationship based on the partial area common among target areas determined to be the same, out of the target areas detected from all the still images included in the slide show, and on the positional relationship between the target areas of the still images.
  • the presentation method determination unit 202 calculates the presentation time of the next still image by multiplying the presentation time of a certain still image by a specified parameter.
  • The presentation method determination unit 202 sets the presentation time of the first still image 801 to the initial value Ts.
  • Still images 801 and 802 and 802 and 803 have a partial relationship, and still images 803 and 804 have no partial relationship.
  • the presentation time of the first still image 801 is the initial value Ts
  • Since the relevance flag of the still images 801 and 802 is 1, the presentation time of the still image 802 is a·Ts.
  • The presentation time of the still image 803 is a²·Ts.
  • the presentation method determination unit 202 returns the presentation time of the still image 804 to the initial value and sets it to Ts.
  • When the parameter a is set to a value between 0 and 1 that becomes smaller as the area of the matching partial region becomes smaller, the still image 801 presented first for the landscape is presented longest, and the remaining parts are presented for presentation times that reflect the amount of information overlapping with the previously presented image.
  • the information processing apparatus 200 can generate a video in which the presentation time of consecutive images changes even if the consecutive images include the same target.
  • The presentation method determination unit 202 determines an effect, BGM, and a jingle to be inserted between a pair of still images based on the target partial relationship in the pair of still images. For example, when the targets included in the continuous still image pair are in a partial relationship, the presentation method determination unit 202 inserts special effects (such as dissolve or fade) registered in advance as effects with little visual change when switching still images.
  • When they are not in a partial relationship, the presentation method determination unit 202 inserts special effects (DVE such as page turning or wipe) registered in advance as effects having a large visual change when switching still images.
  • the presentation method determination unit 202 plays the same BGM during presentation of still image pairs.
  • the presentation method determination unit 202 stops the BGM or switches to a different BGM when switching the still images.
  • The presentation method determination unit 202 may insert jingles between images that do not have a partial relationship.
  • the relevancy determination unit 204 may determine relevance depending on whether or not the objects represented in the two still images included in the still image pair are of the same type.
  • Relevance 4 is the relevance determined in this way. "The objects are of the same type" means that the main objects appearing in a pair of still images that are consecutive in the slide show are objects of the same type.
  • The relevancy determination unit 204 sets 1 in the relevance flag for relevance 4 when the target area in a certain still image and the target area in the next still image are of the same type. When the target area in a certain still image and the target area in the next still image are of different types, the relevancy determination unit 204 sets 0 in the relevance flag for relevance 4. Discrimination of the homogeneity of objects can be realized by a method based on machine learning using the image data (registered data) of objects belonging to each type for which homogeneity is to be determined. First, the relevancy determination unit 204 extracts image feature amounts of targets belonging to the various types from the registered data.
  • the relevancy determination unit 204 may use a global feature such as a color histogram or an edge histogram as the image feature amount.
  • the relevancy determination unit 204 may use local feature amounts such as HoG (Histograms of Oriented Gradients) and SIFT as the image feature amounts.
  • The relevancy determination unit 204 may perform learning using the global features with a method such as an SVM (Support Vector Machine), a neural network, or a GMM (Gaussian Mixture Model). Alternatively, the relevancy determination unit 204 may perform learning after converting the local feature amounts into a feature amount space such as BoW (Bag of Words).
  • To determine homogeneity, the relevance determining unit 204 calculates the similarity between the image feature amount of each target area and each of the various type models obtained as a result of learning. Then, the relevancy determination unit 204 determines that the target region belongs to the type of the closest model whose similarity is equal to or greater than a specified value. The relevancy determination unit 204 determines that target areas assigned to the same type are of the same type. The relevancy determination unit 204 may determine the homogeneity by a method other than the above. When three images including the same type of target are consecutive, the presentation method determination unit 202 treats the relevance as unchanged and, for example, gradually shortens the presentation time at the same time interval.
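  • A sketch of the machine-learning based homogeneity check: learn one model over the registered data and assign each target area to the closest type whose score clears a threshold (scikit-learn is assumed; the feature extraction and the threshold are assumptions):
```python
import numpy as np
from sklearn.svm import SVC

def train_type_model(features, type_labels):
    """features: N x D array of image feature amounts (e.g. colour or edge histograms)
    extracted from the registered data; type_labels: the type of each example."""
    model = SVC(probability=True)
    model.fit(features, type_labels)
    return model

def assign_type(model, feature, min_score=0.5):
    """Return the most similar type, or None if no type is similar enough."""
    scores = model.predict_proba(feature.reshape(1, -1))[0]
    best = int(np.argmax(scores))
    return model.classes_[best] if scores[best] >= min_score else None

def same_type(model, feature_a, feature_b):
    ta, tb = assign_type(model, feature_a), assign_type(model, feature_b)
    return ta is not None and ta == tb
```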
  • Specifically, the presentation method determination unit 202 controls the presentation method based on the following rules.
  • [Rules according to target homogeneity] (4-1) Rules regarding presentation time
  • the presentation method determination unit 202 determines the presentation time of a still image pair based on the homogeneity of objects included in successive still image pairs. For example, the presentation method determination unit 202 sets the presentation time of the still image presented first among the still image groups including the same type of target to the initial value Ts. Then, the presentation method determination unit 202 determines the presentation time of the subsequent still image based on Ts. In addition, the presentation method determination unit 202 may set Tp as the presentation time of a still image with high visibility among a group of still images including the same type of target.
  • the presentation method determination part 202 may determine the presentation time of a subsequent still image on the basis of Tp. In addition, the presentation method determination unit 202 may set the presentation time of the next still image in which the presentation time of the still image is equal to or less than Tq among the still image group including the same type of target as the initial value Ts. And the presentation method determination part 202 may determine the presentation time of a subsequent still image on the basis of Ts. In addition, the presentation method determination unit 202 may set the presentation time of the last presented image among the still image groups including the same type of target to Ts. The presentation method determination unit 202 may calculate the values of Ts and Tp according to the number of images to be presented from the preset presentation time of the entire slide show.
  • the presentation method determination unit 202 determines the presentation time of the subsequent still image independently of the presentation time of the previous still image.
  • the presentation method determination unit 202 may set the presentation time of subsequent still images to, for example, the initial value Ts.
  • the presentation method determination unit 202 may set the presentation time of the subsequent still image to a random value within a specified range.
  • A case where still images obtained by photographing flowers are reproduced will be described with reference to FIG. 9.
  • the relevancy determination unit 204 determines the homogeneity between consecutive still images by a method based on machine learning.
  • the presentation method determination unit 202 calculates the presentation time of the next still image by multiplying the presentation time of a certain still image by the parameter for the relevance flag.
  • the presentation method determination unit 202 sets the presentation time of the first still image 901 to the initial value Ts.
  • the still images 901 and 902 and the still images 902 and 903 have the same type of relationship, and the still images 903 and 904 have a different type of relationship.
  • The presentation time of the still image 902 is a·Ts.
  • The presentation time of the still image 903 is a²·Ts.
  • the presentation method determination unit 202 returns the presentation time of the still image 904 to the initial value and sets it to Ts.
  • When the parameter a is set between 0 and 1, the still image 901 presented first among the still images including plants is presented for a long time, and each subsequent still image is presented for a presentation time that becomes shorter the further it is from 901.
  • the information processing apparatus 200 can generate a video in which the presentation time of consecutive images changes even if the consecutive images include the same target.
  • This embodiment therefore has the effect that a slide show with a tempo that does not bore the viewer can be generated (by reproducing in order images of a plurality of flowers of the same kind taken in a flower field, it can express that there were many subjects of this type).
  • (4-2) Rules regarding effects, BGM, and jingles
  • The presentation method determination unit 202 determines an effect, BGM, and a jingle to be inserted between a pair of still images based on the homogeneity of the objects included in successive still image pairs. For example, when the targets included in the pair of still images are of the same type, the presentation method determination unit 202 inserts special effects (such as dissolves and fades) registered in advance as effects with little visual change when switching still images.
  • When the targets included in the continuous still image pair are of different types, the presentation method determination unit 202 inserts special effects (DVE such as page turning or wipe) registered in advance as effects having a large visual change when switching still images.
  • the presentation method determination unit 202 plays the same BGM during presentation of the still image pairs.
  • the presentation method determination unit 202 stops BGM or switches to a different BGM when switching still images.
  • the presentation method determination unit 202 may insert jingles between different types of still images. Thereby, when the object contained in the continuous still image pair is the same type, the still image pair is smoothly connected without any change in image or sound.
  • the relevancy determination unit 204 may determine the relevance based on the commonality of the shooting locations of two still images included in the still image pair.
  • relevance 5 is the relationship determined in this way: "the shooting locations are the same" means that the two still images of a pair that are consecutive in the slide show were shot at the same location.
  • the relevancy determination unit 204 sets 1 in the relevance flag for relevance 5 when the shooting location of a certain still image is the same as the shooting location of the next still image.
  • otherwise, that is, when the shooting locations differ, the relevancy determination unit 204 sets the relevance flag for relevance 5 to 0.
  • the relevance determination unit 204 can determine the identity of the shooting location based on the similarity of the area other than the target area (the background area) in the still images. For example, the relevancy determination unit 204 may separate the target area and the background area in each still image, and may then determine that the shooting locations are the same when the image feature values extracted from the two background regions are similar (a minimal sketch of this check follows the alternatives listed below).
  • the relevancy determination unit 204 may determine the identity of the shooting location by a method other than the above.
  • the relevancy determination unit 204 may determine the similarity of the shooting location and the similarity of the background between still images consecutive in the slide show.
  • the relevancy determination unit 204 may determine the identity of the shooting location based on the identity of the background areas in all the still images included in the slide show.
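A minimal sketch of the background-similarity check follows. It assumes that the target area is already available as a bounding box and uses a color-histogram intersection over the remaining pixels; the function names, the bin count, and the acceptance threshold are assumptions made for the illustration, and other background features or meta/GPS information could be combined instead.

    import numpy as np

    def background_histogram(image, target_box, bins=16):
        # Color histogram of the pixels outside the target region.
        # image: H x W x 3 uint8 array; target_box: (x0, y0, x1, y1).
        mask = np.ones(image.shape[:2], dtype=bool)
        x0, y0, x1, y1 = target_box
        mask[y0:y1, x0:x1] = False               # exclude the target region
        pixels = image[mask].astype(float)       # background pixels, shape (N, 3)
        hist, _ = np.histogramdd(pixels, bins=(bins,) * 3, range=((0, 256),) * 3)
        hist = hist.ravel()
        return hist / max(hist.sum(), 1.0)

    def same_shooting_location(img_a, box_a, img_b, box_b, threshold=0.7):
        # Relevance 5: 1 when the background regions are similar enough.
        ha = background_histogram(img_a, box_a)
        hb = background_histogram(img_b, box_b)
        similarity = np.minimum(ha, hb).sum()    # histogram intersection in [0, 1]
        return 1 if similarity >= threshold else 0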
  • the relevancy determination unit 204 may determine the identity of the shooting location by combining the shooting location given as meta information and GPS data given as sensor information with the image information. For example, when three images taken at the same shooting location are consecutive, the presentation method determination unit 202 assumes that there is no change in relevance and, for example, gradually reduces the presentation time at the same time interval. Specifically, the presentation method determination unit 202 controls the presentation method based on the following rules. [Rules according to the identity of the shooting location] (5-1) Rules regarding presentation time
  • the presentation method determination unit 202 determines the presentation time of still image pairs based on the identity of the shooting locations of consecutive still image pairs. For example, the presentation method determination unit 202 sets the presentation time of the still image presented first among the still images captured at the same place to the initial value Ts, and then determines the presentation times of the subsequent still images based on Ts. In addition, the presentation method determination unit 202 may set the presentation time of a still image with high visibility among the still images captured at the same place to Tp, and may determine the presentation times of the subsequent still images based on Tp.
  • the presentation method determination unit 202 may reset the presentation time to the initial value Ts for the still image that follows once the presentation time within the group of still images captured at the same place has fallen to or below Tq, and may then determine the presentation times of the subsequent still images based on Ts again. In addition, the presentation method determination unit 202 may set the presentation time of the image presented last in the group of still images captured at the same place to Ts. The presentation method determination unit 202 may calculate the values of Ts and Tp from the preset presentation time of the entire slide show according to the number of images to be presented. When consecutive still image pairs are taken at different locations, the presentation method determination unit 202 determines the presentation time of the subsequent still image independently of the presentation time of the previous still image.
  • the presentation method determination unit 202 may set the presentation time of the subsequent still image to the initial value Ts.
  • the presentation method determination unit 202 may set the presentation time of the subsequent still image to a random value within a specified range.
  • (5-2) Rules regarding effects, BGM, and jingles: The presentation method determination unit 202 determines the effect, BGM, and jingle to be inserted between a pair of still images based on the identity of the shooting locations of consecutive still image pairs. For example, when consecutive still image pairs are shot at the same place, the presentation method determination unit 202 inserts special effects registered in advance as effects with little visual change when switching still images (such as dissolves and fades).
  • when consecutive still image pairs are shot at different locations, the presentation method determination unit 202 inserts special effects registered in advance as effects having a large visual change when switching still images (DVE such as page turning and wipes). Further, for example, when consecutive still image pairs are shot at the same place, the presentation method determination unit 202 plays the same BGM throughout the presentation of the still image pair; when they are captured at different locations, the presentation method determination unit 202 stops the BGM or switches to a different BGM when switching still images. In addition, the presentation method determination unit 202 may insert a jingle between still images taken at different places. Thereby, when consecutive still image pairs are taken at the same place, the still images are connected smoothly without an abrupt change in image or sound.
  • the relevancy determination unit 204 may determine the relevance based on the commonality of the shooting time zones of two still images included in the still image pair.
  • relevance 6 is the relationship determined in this way: "the shooting time zones are the same" means that the two still images of a pair that are consecutive in the slide show were shot in the same time zone.
  • the relevancy determination unit 204 sets 1 in the relevance flag for relevance 6 when the shooting time zone of a certain still image is the same as the shooting time zone of the next still image.
  • the relevancy determination unit 204 sets a relevance flag for relevance 6 to 0 when the shooting time zone of a certain still image is different from the shooting time zone of the next still image.
  • the relevancy determination unit 204 can determine the identity of the shooting time period based on the color information of the background area in the still image. For example, the relevance determination unit 204 divides a day into a plurality of time zones, and holds the statistics of the color histogram of sunlight in each time zone.
  • when the color histogram extracted from the background area of a still image is close to the held statistics for a certain time zone, the relevancy determination unit 204 determines that the still image was captured during that time zone (a minimal sketch of this histogram-based estimation is given after the alternatives below).
  • the relevancy determination unit 204 estimates the shooting time period of each still image.
  • the relevance determination unit 204 determines that the shooting time zones are the same when the estimated time is the same.
  • the relevancy determination unit 204 may determine the identity of the shooting time period by a method other than the above.
  • the relevance determination unit 204 may determine the identity of the shooting time zones based on the similarity of the shooting time zones between consecutive still images in the slide show. Alternatively, the relevancy determination unit 204 may determine the identity of the shooting time zone based on the identity of the shooting time zones in all the still images included in the slide show.
  • the relevancy determination unit 204 may determine the identity of the shooting time period in combination with the shooting time that is meta information in addition to the image information.
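A minimal sketch of the histogram-based time-zone estimation follows. The reference statistics, the time-zone labels, and the use of histogram intersection as the closeness measure are assumptions made for the illustration; meta information such as the recorded shooting time could be combined as noted above.

    import numpy as np

    def estimate_time_zone(background_hist, reference_hists):
        # reference_hists maps a time-zone label (e.g. "morning", "daytime",
        # "evening", "night") to a pre-computed sunlight color-histogram
        # statistic; the image is assigned to the closest time zone.
        def intersection(h1, h2):
            return np.minimum(h1, h2).sum()
        return max(reference_hists,
                   key=lambda zone: intersection(background_hist,
                                                 reference_hists[zone]))

    def same_time_zone_flag(hist_a, hist_b, reference_hists):
        # Relevance 6: 1 when both still images fall in the same estimated zone.
        zone_a = estimate_time_zone(hist_a, reference_hists)
        zone_b = estimate_time_zone(hist_b, reference_hists)
        return 1 if zone_a == zone_b else 0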
  • the presentation method determination unit 202 assumes that there is no change in relevance when three images taken in the same shooting time period are consecutive, and for example, gradually reduces the presentation time at the same time interval.
  • the presentation method determination unit 202 controls the presentation method based on the following rules. [Rules according to the identity of the shooting period] (6-1) Rules regarding presentation time
  • the presentation method determination unit 202 determines the presentation time of still image pairs based on the identity of the shooting time zones of consecutive still image pairs. For example, the presentation method determination unit 202 sets the presentation time of the first still image to be presented as the initial value Ts in the group of still images taken in the same time zone.
  • the presentation method determination unit 202 determines the presentation time of the subsequent still image based on Ts.
  • the presentation method determination unit 202 may set the presentation time of a still image with high visibility among the still image groups taken in the same time zone as Tp.
  • the presentation method determination unit 202 may determine the presentation times of the subsequent still images based on Tp.
  • the presentation method determination unit 202 may reset the presentation time to the initial value Ts for the still image that follows once the presentation time within the group of still images taken in the same time zone has fallen to or below Tq.
  • the presentation method determination unit 202 may then determine the presentation times of the subsequent still images based on Ts.
  • the presentation method determination unit 202 may set the presentation time of the image to be presented last among the still image groups captured in the same time zone as Ts.
  • the presentation method determination unit 202 may calculate the values of Ts and Tp from the preset presentation time of the entire slide show according to the number of images to be presented (a sketch of this calculation is given below). When consecutive still image pairs are photographed in different time zones, the presentation method determination unit 202 determines the presentation time of the subsequent still image independently of the presentation time of the previous still image.
  • the presentation method determination unit 202 may set the presentation time of subsequent still images to, for example, the initial value Ts.
  • the presentation method determination unit 202 may set the presentation time of the subsequent still image to a random value within a specified range.
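One way to derive Ts (and a longer Tp for highly visible images) from the preset duration of the whole slide show, as mentioned above, is sketched here. The closed form assumes the simple case of a single geometric decay by the factor a with no resets; real sequences with resets or the Tq cut-off would have to account for the relevance flags. All names and default values are assumptions made for the illustration.

    def initial_times(total_duration, num_images, a=0.8, p_ratio=1.5):
        # Sum of Ts * (1 + a + a^2 + ... + a^(num_images-1)) = total_duration,
        # so Ts = total_duration * (1 - a) / (1 - a**num_images).
        geometric_sum = (1.0 - a ** num_images) / (1.0 - a)
        Ts = total_duration / geometric_sum
        Tp = p_ratio * Ts    # assumed: highly visible images get a longer time
        return Ts, Tp

    # e.g. a 60-second slide show of 12 images
    Ts, Tp = initial_times(60.0, 12)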
  • (6-2) Rules regarding effects, BGM, and jingles: The presentation method determination unit 202 determines the effect, BGM, and jingle to be inserted between a pair of still images based on the identity of the shooting time zones of consecutive still image pairs. For example, when a pair of still images is shot in the same time zone, the presentation method determination unit 202 inserts special effects registered in advance as effects with little visual change when switching still images (such as dissolves and fades). When a pair of still images is shot in different time zones, the presentation method determination unit 202 inserts special effects registered in advance as effects having a large visual change when switching still images (DVE such as page turning and wipes).
  • when a consecutive still image pair is shot in the same time zone, the presentation method determination unit 202 plays the same BGM throughout the presentation of the still image pair.
  • when a consecutive still image pair is shot in different time zones, the presentation method determination unit 202 stops the BGM or switches to a different BGM when switching still images.
  • the presentation method determination unit 202 may insert jingles between still images in different time zones.
  • the slide show generation unit 203 generates a slide show based on the presentation method information input from the presentation method determination unit 202 and the image information input from the image input unit 210.
  • the relevancy determination unit 204 may group all target areas detected from all still images included in the slide show based on similarity. Then, the relevancy determination unit 204 may determine that the target areas are the same when the target areas detected from the adjacent still image pairs belong to the same group.
  • the detection method of the target area is divided into a detection method for detecting a specific target registered in advance and a detection method for detecting a general target that is not registered.
  • the relevance determination unit 204 may use the registered image data of each target as a template, scan the input image with the template converted into various resolutions, and detect a region whose pixel values differ little from those of the template at the same positions as the corresponding target region (a minimal sketch of this multi-scale template matching is given below).
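A minimal sketch of the multi-scale template matching, written with OpenCV, is given here as one possible realization. The scale set, the normalized-difference threshold, and the use of cv2.matchTemplate with squared differences are assumptions made for the illustration; the patent text only requires that the template be scanned over the image at various resolutions and that regions with small pixel-value differences be detected.

    import cv2

    def detect_registered_target(image_gray, template_gray,
                                 scales=(0.5, 0.75, 1.0, 1.5, 2.0),
                                 max_norm_diff=0.1):
        # Scan the input image with the template resized to several resolutions
        # and keep the position with the smallest normalized squared difference.
        best = None
        for s in scales:
            t = cv2.resize(template_gray, None, fx=s, fy=s)
            th, tw = t.shape[:2]
            if th > image_gray.shape[0] or tw > image_gray.shape[1]:
                continue
            result = cv2.matchTemplate(image_gray, t, cv2.TM_SQDIFF_NORMED)
            min_val, _, min_loc, _ = cv2.minMaxLoc(result)
            if best is None or min_val < best[0]:
                best = (min_val, min_loc, (tw, th))
        if best is None:
            return None
        min_val, (x, y), (tw, th) = best
        # accept only regions whose difference from the template is small
        return (x, y, x + tw, y + th) if min_val <= max_norm_diff else None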
  • the relevancy determination unit 204 determines the identity, the magnitude relationship, and the partial relationship as the relevance type.
  • the presentation rule determination method is the same as in the first embodiment.
  • the image input unit 210 inputs image information of all still images to the relevancy determination unit 204 (step S1001).
  • the relevancy determination unit 204 extracts image feature amounts from all still images.
  • the relevancy determination unit 204 groups the still images obtained by photographing the same object based on the similarity of the image feature amounts (step S1003); a simple sketch of such grouping is given below. In the example of FIG. 5,
  • still images 501, 502, 503, and 504 are classified into group A, still images 505, 506, 507, 508, 509, and 510 are classified into group B, and still images 511, 512, and 513 are classified into group C.
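The grouping of step S1003 can be sketched as a simple greedy procedure over image feature amounts. The feature type (normalized color histograms), the similarity measure, and the threshold are assumptions made for the illustration; any feature amount and clustering method that groups images of the same object by similarity would do.

    import numpy as np

    def group_by_similarity(features, threshold=0.8):
        # features: list of L1-normalized histograms, one per still image.
        def similarity(f1, f2):
            return np.minimum(f1, f2).sum()      # histogram intersection
        groups = []                              # each group is a list of indices
        for idx, f in enumerate(features):
            for group in groups:
                representative = features[group[0]]
                if similarity(f, representative) >= threshold:
                    group.append(idx)
                    break
            else:                                # no similar group found
                groups.append([idx])
        return groups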
  • the relevancy determination unit 204 determines the magnitude relationship and the partial relationship between still images belonging to the same group (S1005).
  • the relevancy determination unit 204 extracts local feature points such as SIFT from all still images and finds corresponding points between them. For group A, it can then be seen that still images 502, 503, and 504 are included in still image 501, and that the areas common to still images 502, 503, and 504 are small.
  • the relevancy determination unit 204 therefore determines that there is a magnitude relationship between still image 501 and still image 502, and that there is a partial relationship between still images 502 and 503 and between still images 503 and 504. By comparing local feature amounts in the same manner, the relevancy determination unit 204 finds, for group B, that still images 506, 507, 508, 509, and 510 are included in still image 505, and further that still images 506 and 507 are included in still image 513.
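A sketch of the SIFT-based comparison used in step S1005 follows, written with OpenCV (cv2.SIFT_create is assumed to be available, i.e. OpenCV 4.4 or later). The match-ratio heuristic and the thresholds used to tell a magnitude relationship (one image contained in the other) from a partial relationship (a small common area) are assumptions made for the illustration, not the patent's exact criterion.

    import cv2

    def match_ratio(des_a, des_b, ratio=0.75):
        # Fraction of image A's SIFT descriptors with a reliable match in
        # image B (Lowe's ratio test).
        if des_a is None or des_b is None or len(des_b) < 2:
            return 0.0
        matches = cv2.BFMatcher().knnMatch(des_a, des_b, k=2)
        good = [m for m, n in matches if m.distance < ratio * n.distance]
        return len(good) / max(len(des_a), 1)

    def pair_relationship(img_a, img_b, high=0.5, low=0.2):
        # img_a, img_b: 8-bit grayscale still images from the same group.
        sift = cv2.SIFT_create()
        _, des_a = sift.detectAndCompute(img_a, None)
        _, des_b = sift.detectAndCompute(img_b, None)
        r_ab = match_ratio(des_a, des_b)    # how much of A is found in B
        r_ba = match_ratio(des_b, des_a)    # how much of B is found in A
        if r_ba >= high and r_ab < high:
            return "magnitude"              # B is contained in A
        if r_ab >= high and r_ba < high:
            return "magnitude"              # A is contained in B
        if r_ab >= low or r_ba >= low:
            return "partial"                # only a small common area
        return "none"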
  • the presentation method determination unit 202 determines the presentation time length 1101 and the effect 1102 as shown in FIG. 11 (S1007).
  • the slide show generating unit 203 generates a slide show using the determined presentation method (S1009).
  • a system or apparatus in which the separate features included in the respective embodiments are combined in any way is also included in the scope of the present invention.
  • the present invention may be applied to a system composed of a plurality of devices, or may be applied to a single device.
  • the present invention can also be applied to a case where an information processing program that implements the functions of the embodiments is supplied directly or remotely to a system or apparatus.
  • each of the information processing apparatus 100 and the information processing apparatus 200 can be realized by a computer and a program for controlling the computer, dedicated hardware, or a combination of the computer and the program for controlling the computer and dedicated hardware.
  • the relevance comparison unit 101, the presentation method determination unit 102, the slide show generation unit 103, the relevance comparison unit 201, the presentation method determination unit 202, the slide show generation unit 203, and the relevance determination unit 204 can each be realized, for example, by a program that is read from a recording medium storing the program and executed by a computer.

Abstract

The problem to be solved by the present invention is to present still images in accordance with changes in the degrees of relevance between them. The proposed solution is an information processing device characterized by the use of a comparison means, a presentation method determination means, and a generation means. Given a still image collection containing at least three still images, the comparison means compares a first degree of relevance with a second degree of relevance. The first degree of relevance indicates the degree of relevance between the still images constituting a first still image pair belonging to the still image collection, and the second degree of relevance indicates the degree of relevance between the still images constituting a second still image pair. On the basis of the change from the first degree of relevance to the second degree of relevance, the presentation method determination means uses first information to determine second information, the first information specifying the presentation method for the first still image pair and the second information specifying the presentation method for the second still image pair. On the basis of the presentation method determined by the presentation method determination means, the generation means generates a slide show containing the first still image pair and the second still image pair.
PCT/JP2012/061788 2011-05-12 2012-04-27 Dispositif de traitement d'informations, procédé de traitement d'informations et programme de traitement d'informations WO2012153744A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-107103 2011-05-12
JP2011107103A JP2014170979A (ja) 2011-05-12 2011-05-12 情報処理装置、情報処理方法および情報処理プログラム

Publications (1)

Publication Number Publication Date
WO2012153744A1 true WO2012153744A1 (fr) 2012-11-15

Family

ID=47139222

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/061788 WO2012153744A1 (fr) 2011-05-12 2012-04-27 Dispositif de traitement d'informations, procédé de traitement d'informations et programme de traitement d'informations

Country Status (2)

Country Link
JP (1) JP2014170979A (fr)
WO (1) WO2012153744A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6532283B2 (ja) * 2015-05-12 2019-06-19 キヤノン株式会社 画像処理装置、画像処理方法、及びプログラム
US11341378B2 (en) 2016-02-26 2022-05-24 Nec Corporation Information processing apparatus, suspect information generation method and program

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005348371A (ja) * 2004-06-07 2005-12-15 Fuji Photo Film Co Ltd 電子アルバム表示システム、電子アルバム表示方法、及び電子アルバム表示プログラム
JP2006129453A (ja) * 2004-09-29 2006-05-18 Nikon Corp 画像再生装置、および画像再生プログラム
JP2006261877A (ja) * 2005-03-16 2006-09-28 Casio Comput Co Ltd 画像再生装置およびプログラム
JP2008061032A (ja) * 2006-08-31 2008-03-13 Sony Corp 画像再生装置及び画像再生方法、並びにコンピュータ・プログラム
WO2008133046A1 (fr) * 2007-04-13 2008-11-06 Nec Corporation Dispositif de groupement de photographies, procédé de groupement de photographies et programme de groupement de photographies
JP2010021819A (ja) * 2008-07-11 2010-01-28 Casio Comput Co Ltd 画像表示装置、画像表示方法、及び、プログラム
JP2010206508A (ja) * 2009-03-03 2010-09-16 Olympus Imaging Corp 表示装置、撮像装置および表示装置用プログラム

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015035727A (ja) * 2013-08-09 2015-02-19 株式会社リコー 表示システム、情報端末、表示装置、再生制御プログラム、再生プログラム及び再生制御方法
CN111083361A (zh) * 2019-12-11 2020-04-28 维沃移动通信有限公司 图像获取方法及电子设备
CN114886417A (zh) * 2022-05-10 2022-08-12 南京布尔特医疗技术发展有限公司 一种智能化安全护理监控系统及方法
CN114886417B (zh) * 2022-05-10 2023-09-22 南京布尔特医疗技术发展有限公司 一种智能化安全护理监控系统及方法

Also Published As

Publication number Publication date
JP2014170979A (ja) 2014-09-18

Similar Documents

Publication Publication Date Title
US11321385B2 (en) Visualization of image themes based on image content
US11132578B2 (en) System and method for creating navigable views
US11094131B2 (en) Augmented reality apparatus and method
KR101605983B1 (ko) 얼굴 검출을 이용한 이미지 재구성
US8548249B2 (en) Information processing apparatus, information processing method, and program
Goferman et al. Context-aware saliency detection
US8908976B2 (en) Image information processing apparatus
WO2012153744A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme de traitement d'informations
CN111881755B (zh) 一种视频帧序列的裁剪方法及装置
CN111638784B (zh) 人脸表情互动方法、互动装置以及计算机存储介质
CN111491187A (zh) 视频的推荐方法、装置、设备及存储介质
CN110418148B (zh) 视频生成方法、视频生成设备及可读存储介质
JP5776471B2 (ja) 画像表示システム
US20160140748A1 (en) Automated animation for presentation of images
WO2012153747A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme de traitement d'informations
JP2009289210A (ja) 重要物体認識装置および重要物体認識方法ならびにそのプログラム
CN109791556B (zh) 一种用于从移动视频自动创建拼贴的方法
CN115988262A (zh) 用于视频处理的方法、装置、设备和介质
JP5685958B2 (ja) 画像表示システム
Huang et al. Automatic detection of object of interest and tracking in active video
KR101573482B1 (ko) 프레임 클러스터링을 이용한 광고 삽입 장치 및 방법
WO2023039865A1 (fr) Procédé de traitement d'image, procédé de traitement vidéo, procédé d'apprentissage, dispositif, produit programme et support de stockage
Shankar et al. A novel semantics and feature preserving perspective for content aware image retargeting
CN116980695A (zh) 视频处理方法、装置、设备及存储介质
Chapdelaine et al. Designing caption production rules based on face, text, and motion detection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12782029

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12782029

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP