WO2011125743A1 - Content processing apparatus and method, and program - Google Patents
Content processing apparatus and method, and program
- Publication number
- WO2011125743A1 (PCT/JP2011/058021)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- content
- image
- displayed
- timeline
- representative image
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/16—Analogue secrecy systems; Analogue subscription systems
- H04N7/173—Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
- H04N7/17309—Transmission or handling of upstream communications
- H04N7/17318—Direct or substantially direct transmission and handling of requests
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/5838—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/73—Querying
- G06F16/738—Presentation of query results
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/78—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/783—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/7847—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using low-level visual features of the video content
- G06F16/785—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using low-level visual features of the video content using colour or luminescence
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/102—Programmed access in sequence to addressed parts of tracks of operating record carriers
- G11B27/105—Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/34—Indicating arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/23418—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/482—End-user interface for program selection
- H04N21/4828—End-user interface for program selection for searching program descriptors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8146—Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
- H04N21/8153—Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics comprising still images, e.g. texture, background image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/84—Generation or processing of descriptive data, e.g. content descriptors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
- H04N9/8227—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal the additional signal being at least another television signal
Definitions
- The present invention relates to a content processing apparatus, method, and program, and more particularly to a content processing apparatus, method, and program that make it easier to grasp the substance of moving-image content.
- As one conventional example of a timeline display, a waveform image, such as that of an audio signal or of the change in the luminance value of a pixel, is displayed along the time axis.
- In another conventional technique, an index video is recorded by decimating the frames of the normal video, and the index video is played back for the selected video.
- However, these conventional techniques have the problem that the user cannot easily find a scene that includes a desired image, nor easily search for a scene with a desired characteristic.
- For example, it was not possible to search the content for an individual image or for a portion in which a plurality of images are displayed in sequence, nor to compare the proportion of time during which a desired image is displayed in the content with the proportion for other images.
- Moreover, for content that is long in duration the display becomes complicated, and such a display can hardly be said to make it easy for the user to understand the content.
- The present invention has been made in view of this situation, and makes it easier to grasp the substance of moving-image content.
- One aspect of the present invention is a content processing apparatus comprising: metadata extraction means for extracting, from input moving-image content, metadata including a representative image that is a still image and information specifying the frame corresponding to the representative image; and timeline display means for generating, based on the metadata, display data for displaying the content on a timeline, wherein, when instructed to display thumbnails, the timeline display means displays a thumbnail of each representative image together with the content displayed on the timeline, in association with the temporal position of the frame of that representative image in the content.
- The metadata may include information on the scene change points of the content, and the timeline display means may identify the scenes constituting the content based on the metadata, display on the timeline a figure representing each scene with a size corresponding to the scene's time length, taking the horizontal or vertical direction of the screen as the time axis, and display the thumbnail of a representative image in association with a part of the figure.
- The timeline display means may display the content on the timeline by rendering the figure for each scene in a representative color of the scene, specified by a predetermined method.
- The timeline display means may reproduce the content, display its moving image at a predetermined position on the screen, and display a slider that indicates the position on the time axis of the frame being reproduced in the content.
- The timeline display means may change the display mode of the thumbnail corresponding to a representative image when the position on the time axis indicated by the slider reaches the frame position of that representative image.
- The timeline display means may generate display data for displaying a screen having a representative image display unit that displays a list of the representative images, a moving image display unit that displays the video of the reproduced content, and a timeline display unit that displays the content on a timeline together with the thumbnails.
- One aspect of the present invention is a content processing method in which metadata extraction means extracts, from input moving-image content, metadata including a representative image that is a still image and information specifying the frame corresponding to the representative image, and timeline display means generates, based on the metadata, display data for displaying the content on a timeline and, when display of the thumbnails of the representative images is instructed, displays a thumbnail of each representative image together with the content displayed on the timeline, in association with the temporal position of the frame of that representative image in the content.
- One aspect of the present invention is a program that causes a computer to function as: metadata extraction means for extracting, from input moving-image content, metadata including a representative image that is a still image and information specifying the frame corresponding to the representative image; and timeline display means for generating, based on the metadata, display data for displaying the content on a timeline, wherein, when display of the thumbnails of the representative images is instructed, the timeline display means displays a thumbnail of each representative image together with the content displayed on the timeline, in association with the temporal position of the frame of that representative image in the content.
- In one aspect of the present invention, metadata including a representative image that is a still image and information specifying the frame corresponding to the representative image is extracted from input moving-image content; display data for displaying the content on a timeline is generated based on the metadata; and a thumbnail of each representative image is displayed together with the content displayed on the timeline, in association with the temporal position of the frame of that representative image in the content.
- FIG. 6 is a diagram illustrating an example in which a search is performed on the screens of FIGS. 2 to 4. FIG. 7 is a diagram showing an example of a screen on which the search results are displayed on the screen shown in FIG. 6.
- FIG. 16 is a block diagram illustrating a configuration example of a personal computer.
- FIG. 1 is a block diagram showing a configuration example of a content processing apparatus according to an embodiment of the present invention.
- the content processing apparatus 10 displays the input moving image content on the timeline, searches for a predetermined image from the content as necessary, and displays the search result on the timeline.
- the content processing apparatus 10 is provided with a content input unit 21, a metadata extraction unit 22, an image feature quantity extraction unit 23, a metadata database 24, and an image feature quantity database 25.
- the content processing apparatus 10 includes a search image input unit 26, a search unit 27, and an output unit 28.
- the content input unit 21 receives input of content data.
- The content is moving-image content, including audio, captions, and the like as necessary; it is, for example, content edited as part of a broadcast program.
- the metadata extraction unit 22 analyzes the content data supplied from the content input unit 21 and extracts metadata from the content.
- the metadata is, for example, information on the scene change point of the content, information on the time necessary for displaying the timeline described later, and information on the representative image of the content.
- The representative image of the content included in the metadata is, for example, the frame image (still image) at which the level of the audio signal is highest within each of the scenes constituting the content; it is assumed that representative images have been extracted by such a predetermined method. Information such as the representative image data and the frame number of the frame corresponding to each representative image is also extracted by the metadata extraction unit 22.
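The extraction rule described above can be sketched in a few lines. This is only an illustrative reconstruction, not the patent's implementation: it assumes scene boundaries are given as frame ranges and that a per-frame audio level has already been measured.

```python
def pick_representative_frames(scene_bounds, audio_levels):
    """For each scene, given as a (start, end) frame range, pick the frame
    with the highest audio level as the scene's representative image."""
    representatives = []
    for start, end in scene_bounds:
        # argmax of the audio level within the scene's frame range
        best = max(range(start, end), key=lambda f: audio_levels[f])
        representatives.append(best)
    return representatives

# Example: two scenes over six frames, with hypothetical per-frame levels.
levels = [0.1, 0.9, 0.3, 0.2, 0.5, 0.4]
print(pick_representative_frames([(0, 3), (3, 6)], levels))  # [1, 4]
```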
- the metadata extracted by the metadata extraction unit is stored in the metadata database 24 in association with content identification information, for example.
- the image feature amount extraction unit 23 analyzes the data of the content supplied from the content input unit 21, and extracts the image feature amount from the content.
- the image feature amount is information used to obtain a similarity with the search image in the processing of the search unit 27 described later.
- The image feature amount is, for example, information obtained by dividing each one-frame still image constituting the content into a plurality of predetermined regions and describing a representative color of each region. Alternatively, the image feature amount may be a histogram of the pixel values of each one-frame still image.
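The region-based feature described above can be sketched as follows; this is a hypothetical illustration (grayscale pixels and the mean value standing in for the "representative color" of a region are assumptions, not details from the patent):

```python
def region_features(frame, rows=2, cols=2):
    """Divide a frame (here a 2-D list of grayscale pixel values, for
    simplicity) into rows x cols regions and describe each region by its
    mean value; with RGB pixels this would be a mean per channel."""
    h, w = len(frame), len(frame[0])
    feats = []
    for r in range(rows):
        for c in range(cols):
            region = [frame[y][x]
                      for y in range(r * h // rows, (r + 1) * h // rows)
                      for x in range(c * w // cols, (c + 1) * w // cols)]
            feats.append(sum(region) / len(region))
    return feats

frame = [[0, 0, 8, 8],
         [0, 0, 8, 8],
         [4, 4, 2, 2],
         [4, 4, 2, 2]]
print(region_features(frame))  # [0.0, 8.0, 4.0, 2.0]
```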
- the image feature amount extracted by the image feature amount extraction unit 23 is stored in the image feature amount database 25 in association with, for example, content identification information and a frame number.
- the image feature quantity extraction unit 23 extracts the image feature quantity from the search image input from the search image input unit 26 in the same manner.
- the search image input unit 26 accepts input of a search image that is a still image.
- the search image is, for example, an image arbitrarily selected by the user, and is input to search for an image similar to the search image from the content input from the content input unit 21.
- The search unit 27 compares, by a predetermined method, the image feature amount of the search image extracted by the image feature amount extraction unit 23 with the image feature amounts stored in the image feature amount database 25. As a result, the similarity between the image feature amount of the search image and each image feature amount of the one-frame still images constituting the content stored in the image feature amount database 25 is calculated as a numerical value.
- The search unit 27 then identifies, for example, the still images whose image feature amounts have a similarity to the image feature amount of the search image equal to or greater than a predetermined threshold, and supplies the frame numbers and the like of these still images to the output unit 28.
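The thresholded comparison can be sketched as follows. Cosine similarity is used here purely as an example of the "predetermined method"; the patent does not specify the similarity measure:

```python
def find_similar_frames(query_feat, frame_feats, threshold=0.9):
    """Return the frame numbers whose feature vectors have a cosine
    similarity to the query image's feature vector at or above the
    threshold (the similarity is calculated as a numerical value)."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(y * y for y in b) ** 0.5
        return dot / (na * nb) if na and nb else 0.0
    return [n for n, feat in frame_feats.items()
            if cosine(query_feat, feat) >= threshold]

# Hypothetical per-frame feature vectors keyed by frame number.
feats = {0: [1.0, 0.0], 1: [0.9, 0.1], 2: [0.0, 1.0]}
print(find_similar_frames([1.0, 0.0], feats, threshold=0.95))  # [0, 1]
```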
- the output unit 28 reads content metadata from the metadata database 24 and generates display data necessary for timeline display of the content.
- Further, the output unit 28 reads the metadata of the content from the metadata database 24 and generates display data for displaying the search results on the timeline, based on the frame numbers of the still images supplied from the search unit 27 and the read metadata.
- the display data output from the output unit 28 is supplied to a display (not shown) or the like and displayed as an image as described later.
- FIG. 2 is a diagram showing an example of a screen displayed on a display (not shown) based on display data output from the output unit 28.
- the screen shown in FIG. 2 has a moving image display unit 51.
- the content image input from the content input unit 21 is displayed as a moving image in the moving image display unit 51.
- the screen shown in FIG. 2 has a representative image display unit 52.
- the screen shown in FIG. 2 has a timeline display unit 53.
- the content is displayed on the timeline so that the horizontal direction in the figure corresponds to the time axis of the content. That is, the content is displayed corresponding to the time axis so that the left end of the timeline display unit 53 corresponds to the start time of the content and the right end of the timeline display unit 53 corresponds to the end time of the content.
- the timeline display unit 53 includes a scene display area 71.
- In the scene display area 71, each scene of the content is displayed as a rectangle whose width (length) corresponds to the temporal length of the scene. That is, eight rectangles are displayed in the scene display area 71, so it can be seen that this content is composed of eight scenes.
- The start point and end point of each scene are specified based on the scene change point information included in the metadata read from the metadata database 24, and the rectangles of the scene display area 71 are displayed accordingly.
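As an illustrative sketch (the 400-pixel timeline width is a made-up parameter), the scene change points could be converted into rectangles whose widths are proportional to scene duration like this:

```python
def scene_rectangles(change_points, total_frames, timeline_width):
    """Convert scene change points (frame numbers at which a new scene
    starts, excluding frame 0) into (x, width) rectangles whose widths are
    proportional to each scene's temporal length."""
    bounds = [0] + list(change_points) + [total_frames]
    rects = []
    for start, end in zip(bounds, bounds[1:]):
        x = start * timeline_width / total_frames
        width = (end - start) * timeline_width / total_frames
        rects.append((x, width))
    return rects

# 100-frame content with scene changes at frames 25 and 50, 400-px timeline.
print(scene_rectangles([25, 50], 100, 400))
# [(0.0, 100.0), (100.0, 100.0), (200.0, 200.0)]
```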
- For convenience, the rectangles shown in the scene display area 71 are all drawn in white in the figure, but in practice they are displayed in the representative colors of the respective scenes.
- The representative color of a scene is specified, for example, as the color corresponding to the most frequent pixel value among the pixel values of all the frames in the scene.
- The representative color of a scene may also be specified by other methods; in short, any color that suits the impression of the scene may be set as the representative color.
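A minimal sketch of the most-frequent-color rule, assuming RGB tuples and a coarse quantization step (the quantization is an illustrative choice so that near-identical colors fall into the same bucket):

```python
from collections import Counter

def representative_color(frames, step=32):
    """Pick the most frequent quantized pixel color across all frames of a
    scene. Each frame is a list of (R, G, B) tuples; quantizing channel
    values to buckets of `step` groups near-identical colors together."""
    counts = Counter()
    for frame in frames:
        for pixel in frame:
            counts[tuple(v // step * step for v in pixel)] += 1
    return counts.most_common(1)[0][0]

# A two-frame scene dominated by reddish pixels, with a couple of blue ones.
scene = [[(200, 30, 30), (201, 20, 28), (10, 10, 250)],
         [(198, 29, 31), (12, 8, 240), (199, 31, 30)]]
print(representative_color(scene))  # (192, 0, 0)
```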
- Thumbnails of the representative images are displayed above the scene display area 71 in the figure. That is, the nine representative images displayed on the representative image display unit 52 are displayed at positions corresponding to their frame numbers in the content.
- The representative image displayed in the first row, first column of the representative image display unit 52 is an image of a frame included in the first scene of the content, and is displayed as the thumbnail 72-1 above the scene display area 71.
- A dotted line indicating the position of the frame is drawn from the left and right ends of the thumbnail 72-1 toward a point slightly to the left of the center of the top edge of the leftmost rectangle in the scene display area 71, thereby showing the temporal position of the frame.
- That is, it is shown that the representative image displayed in the first row, first column of the representative image display unit 52 is the image of the frame at the position on the time axis slightly to the left of the center of the leftmost rectangle of the scene display area 71. In this way, the user can grasp the temporal position, within the content, of the frame of each representative image.
- thumbnails are displayed in association with the content displayed on the timeline so that the temporal position of the representative image in the content can be grasped.
- each thumbnail may be generated based on the frame data of the representative image, or may be generated in advance and included in the metadata.
- the thumbnail 72-2 represents the representative image displayed in the second row and third column of the representative image display unit 52, and it can be seen that the thumbnail 72-2 is an image of a frame included in the second scene of the content.
- the thumbnail 72-3 represents the representative image displayed in the third row and the third column of the representative image display unit 52, and it can be seen that the thumbnail 72-3 is an image of a frame included in the third scene of the content.
- Similarly, the thumbnails 72-4 to 72-9 are displayed, so that thumbnails representing each of the nine representative images displayed on the representative image display unit 52 appear above the scene display area 71.
- The thumbnails are displayed, for example, alternately superimposed on one another.
- For example, the thumbnail 72-1 is displayed so as to overlap the thumbnail 72-2, and a part of the image is hidden.
- The superimposed thumbnails are displayed with a transparency of, for example, 50%, so that a thumbnail underneath shows through the thumbnail displayed over it.
- buttons 81 to 83 shown in FIG. 2 are configured as GUI components, for example.
- When the button 83 is operated, the thumbnails 72-1 to 72-9 are displayed above the scene display area 71. That is, the screen shown in FIG. 2 is initially displayed without the thumbnails 72-1 to 72-9, and when the button 83 is operated, the thumbnails 72-1 to 72-9 are each displayed above the scene display area 71.
- the button 81 is a button for reproducing the content and causing the moving image display unit 51 to display the moving image.
- the button 82 is a button for stopping the reproduction of the content.
- the position of the frame currently reproduced is indicated by the slider 91 in the timeline display unit 53.
- FIG. 3 is a diagram showing an example of the screen when a predetermined time elapses when the button 81 is operated to reproduce the content on the screen shown in FIG.
- the slider 91 is, for example, a red rectangular frame displayed superimposed on the scene display area 71, and is displayed so that the length in the horizontal direction in the figure increases with the passage of time.
- the right end portion of the slider 91 represents the current content playback position. As shown in FIG. 3, the right end portion of the slider 91 has moved to the right on the scene display area 71 because a predetermined time has elapsed since the content was reproduced.
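The slider behaviour can be sketched as a mapping from the playback frame to an x coordinate, together with a check of whether the playback position coincides with a representative image's frame (used for thumbnail highlighting). The pixel width and frame numbers are hypothetical:

```python
def slider_state(current_frame, total_frames, timeline_width, rep_frames):
    """Map the frame being played back to the x coordinate of the slider's
    right edge, and report which representative frame (if any) the playback
    position currently coincides with, for thumbnail highlighting."""
    x = current_frame * timeline_width / total_frames
    hit = current_frame if current_frame in rep_frames else None
    return x, hit

# 100-frame content on a 400-px timeline, representative frames at 25 and 70.
print(slider_state(25, 100, 400, {25, 70}))  # (100.0, 25)
print(slider_state(30, 100, 400, {25, 70}))  # (120.0, None)
```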
- FIG. 4 is a diagram showing an example of a screen when more time has elapsed from the state shown in FIG.
- In FIG. 4, the right end of the slider 91 has moved still further to the right on the scene display area 71 and now coincides with the position of the frame of the representative image corresponding to the thumbnail 72-4.
- The thumbnail 72-4 represents the representative image displayed in the second row, first column of the representative image display unit 52, and the image displayed on the moving image display unit 51 in FIG. 4 matches the representative image displayed in the second row, first column.
- the thumbnail 72-4 is enlarged and displayed so as to have a larger display area than other thumbnails.
- The thumbnail 72-4 is displayed over the thumbnails 72-3 and 72-5 with a transparency of 0%; that is, the thumbnails 72-3 and 72-5 do not show through it.
- In this way, when the playback position reaches the frame of a representative image, the thumbnail representing that representative image is highlighted.
- Here, the thumbnail is highlighted by enlarging its display area and overlaying it with a transparency of 0%, but the display mode of the thumbnail may be changed by other methods.
- In this way, the content can be displayed on the timeline in an easy-to-understand manner.
- In the timeline display unit 53, each scene of the content and the thumbnails representing the representative images are displayed along the time axis, so when the user plays the content, it is possible to grasp in advance what kind of scene will appear at what time.
- Further, as the right end of the slider 91 moves, the thumbnails in the timeline display unit 53 are highlighted in sequence, so it is possible, for example, to evaluate the quality of the content's editing while paying attention to the highlighted thumbnails.
- In step S21, the content input unit 21 receives input of content data.
- In step S22, the metadata extraction unit 22 analyzes the content data supplied from the content input unit 21 and extracts metadata from the content. At this time, for example, information on the scene change points of the content, information on the time required for the timeline display, and information such as the representative images of the content are extracted as metadata.
- the representative image of the content included in the metadata is extracted by a predetermined method.
- Information such as the representative image data and the frame number of the frame corresponding to the representative image is also extracted by the metadata extraction unit.
- In step S23, the metadata database 24 stores the metadata extracted in step S22.
- In step S24, the output unit 28 reads the content's metadata from the metadata database 24 and generates the display data necessary for the timeline display of the content. As a result, a screen such as that described above with reference to FIG. 2 is displayed. As described above, the screen shown in FIG. 2 is initially displayed with the thumbnails 72-1 to 72-9 not yet shown.
- In step S25, the output unit 28 determines whether thumbnail display has been instructed, and waits until it determines that it has.
- For example, when the button 83 in FIG. 2 is operated, it is determined in step S25 that thumbnail display has been commanded, and the process proceeds to step S26.
- Thereafter, the output unit 28 generates display data as appropriate so as to display screens such as those shown in FIGS. 3 and 4 in response to GUI operations on the screen.
- the search image input unit 26 receives an input of a search image that is a still image.
- the search image is, for example, an image arbitrarily selected by the user, and is input to search for an image similar to the search image from the content input from the content input unit 21.
- FIG. 6 shows an example in which a search is performed on the screen described above with reference to FIGS. 2 to 4.
- This figure shows an example in which a search image is searched for in the same content as that shown in FIGS. 2 to 4.
- a search image display area 54 is provided, and images 101 to 103 which are search images input via the search image input unit 26 are displayed in the search image display area 54. Yes.
- three representative images are selected from the nine representative images displayed on the representative image display unit 52 as search images.
- That is, of the images displayed on the representative image display unit 52, the image in the second row, second column is the image 101; the image in the first row, third column is the image 102; and the image in the first row, first column is the image 103.
- FIG. 7 is a diagram showing an example of a screen on which the search result is displayed by operating the button 84 on the screen shown in FIG.
- The outer frames (display frames) of the images 101 to 103 displayed in the search image display area 54 are each displayed in a predetermined color.
- In the drawing, the predetermined colors are represented by differences in hatching pattern.
- For example, the outer frame of the image 101 is displayed in blue, the outer frame of the image 102 in green, and the outer frame of the image 103 in red.
- Color bars indicating the positions of frames similar to each of the images 101 to 103 are displayed superimposed on the rectangles corresponding to the scenes in the scene display area 71.
- Each color bar shown in FIG. 7 indicates a range of frame positions and has a certain width in the horizontal direction in the figure. That is, when a still image is searched for within moving-image content, a series of consecutive frames with high similarity is normally detected, so coloring those frame positions produces a bar.
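Turning such runs of consecutive matched frames into bars amounts to run-length grouping of the matched frame numbers. The sketch below is illustrative only; `frames_to_bars` and its data are hypothetical names, not from the patent.

```python
def frames_to_bars(frame_numbers):
    """Group matched frame numbers into (start, end) runs of consecutive frames.
    Each run corresponds to one color bar on the timeline."""
    bars = []
    for n in sorted(frame_numbers):
        if bars and n == bars[-1][1] + 1:
            # extend the current run
            bars[-1] = (bars[-1][0], n)
        else:
            # start a new run (an isolated frame still yields a minimal bar)
            bars.append((n, n))
    return bars

matches = [3, 4, 5, 6, 40, 41, 42, 90]
print(frames_to_bars(matches))  # [(3, 6), (40, 42), (90, 90)]
```

Each `(start, end)` pair would then be mapped to a horizontal extent within the scene rectangle it falls into.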
- The color bars 111-1 to 111-4 indicate the positions of frames with a high degree of similarity to the image 101, and are displayed in the same color (for example, blue) as the outer frame of the image 101.
- The color bars 112-1 to 112-3 indicate the positions of frames with a high degree of similarity to the image 102, and are displayed in the same color (for example, green) as the outer frame of the image 102.
- The color bars 113-1 to 113-3 indicate the positions of frames with a high degree of similarity to the image 103, and are displayed in the same color (for example, red) as the outer frame of the image 103.
- In this way, the user can see at a glance where in the content, and for how long, the target image (search image) appears. Furthermore, the user can see at a glance how the parts displaying the plurality of images (for example, the images 101 to 103) are combined within the content.
- For example, suppose the images 101 to 103 are images of a missile: the image 101 shows the missile before launch, the image 102 during launch, and the image 103 after launch.
- In such content, images are often shown repeatedly before, during, and after the launch of the missile.
- In this example, the content includes roughly four portions that display missile images.
- The first is the first scene, corresponding to the leftmost rectangle in the scene display area 71. That is, since the portions corresponding to the color bars 111-1, 112-1, and 113-1 display images from before, during, and after the missile launch, it can be seen that missile images were displayed in this portion.
- The second is the fifth scene, corresponding to the fifth rectangle from the left in the scene display area 71. Likewise, since the portions corresponding to the color bars 111-2, 112-2, and 113-2 display images from before, during, and after the launch, it can be seen that missile images were displayed here.
- The third is the seventh scene, corresponding to the seventh rectangle from the left in the scene display area 71. Since the portion corresponding to the color bar 111-3 displays an image from before the launch, it can be seen that a missile image was displayed here.
- The fourth is the eighth scene, corresponding to the eighth rectangle from the left in the scene display area 71. Since the portions corresponding to the color bars 111-4, 112-3, and 113-3 display images from before, during, and after the launch, it can be seen that missile images were displayed here.
- In this way, the user can understand the content at a glance and can, for example, evaluate the editing technique of the content's editor.
- In FIG. 7, an example in which images are searched for within a single content has been described, but images may also be searched for across a plurality of contents.
- FIG. 8 is a diagram showing an example of a screen displayed by display data generated by the output unit 28 of the content processing apparatus 10 in FIG. 1 when images are searched for a plurality of contents.
- The screen shown in FIG. 8 includes a moving image display unit 151, a timeline display unit 153, and a search image display area 154. In the example of FIG. 8, seven contents are displayed in timeline form on the timeline display unit 153.
- the timeline display unit 153 includes scene display areas corresponding to the number of contents to be searched.
- the timeline display unit 153 includes scene display areas 171-1 to 171-7.
- In the scene display areas 171-1 to 171-7, each scene of each content is displayed as a rectangle whose width (length) corresponds to the temporal length of the scene.
- For example, in the scene display area 171-1, three rectangles are displayed, so it can be seen that this content consists of three scenes.
- The start and end points of each scene are specified based on the scene change point information included in the metadata read from the metadata database 24, and the rectangles are displayed accordingly in the scene display areas 171-1 to 171-7.
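One way the rectangle layout could be derived from the scene change points is sketched below. This is an illustrative assumption only; `scene_rectangles` and its parameters do not appear in the patent.

```python
def scene_rectangles(change_points, total_frames, area_width_px):
    """Convert scene change points (frame numbers) into rectangles whose
    pixel widths are proportional to each scene's temporal length."""
    bounds = list(change_points) + [total_frames]  # close the last scene
    rects = []
    for start, end in zip(bounds, bounds[1:]):
        rects.append({
            "start": start,
            "end": end,
            "width": round(area_width_px * (end - start) / total_frames),
        })
    return rects

# Three scenes in 1000-frame content, drawn inside a 500 px scene display area:
rects = scene_rectangles([0, 200, 700], 1000, 500)
print([r["width"] for r in rects])  # [100, 250, 150]
```

The widths sum to the display-area width, so adjacent rectangles tile the timeline with no gaps.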
- Each rectangle shown in the scene display area 171-1 to the scene display area 171-7 is displayed in, for example, the representative color of the scene (however, in the drawing, all are displayed in white for convenience).
- The representative color of a scene is identified, for example, as the color accounting for the largest share of the pixel values across all frames in the scene. The representative color may also be identified by other methods; in short, any color that suits the impression of the scene may be set as the representative color.
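Reading the passage above as "the most frequent (quantized) color across the scene's frames" — one of several readings the patent allows — a minimal sketch might look like this. The function, data layout, and quantization step are all assumptions for illustration.

```python
from collections import Counter

def representative_color(scene_frames, quant=64):
    """Pick the most frequent quantized RGB color across all frames of a scene.
    Quantizing to a coarse grid makes near-identical pixels count as one color."""
    counts = Counter()
    for frame in scene_frames:
        for (r, g, b) in frame:
            counts[(r // quant * quant, g // quant * quant, b // quant * quant)] += 1
    return counts.most_common(1)[0][0]

# Two tiny frames dominated by bluish pixels -> a blue bucket wins.
scene = [[(10, 10, 200), (12, 8, 190)], [(11, 9, 210), (250, 0, 0)]]
print(representative_color(scene))  # (0, 0, 192)
```

Any other heuristic (e.g. the average color, or a perceptually weighted choice) would fit the claim's "predetermined method" equally well.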
- a still image display area 175-1 to a still image display area 175-7 are provided on the left side of the scene display area 171-1 to the scene display area 171-7, respectively.
- Each of the images displayed in the still image display area 175-1 to the still image display area 175-7 is, for example, an image of the top frame of each content, a predetermined representative image, or the like.
- each of the character strings described on the upper side of the image displayed in the still image display area 175-1 to the still image display area 175-7 represents, for example, identification information of each content.
- the moving image display unit 151 displays a moving image obtained by reproducing the content selected by the user among the content displayed on the timeline in the timeline display unit 153.
- In the search image display area 154, a search image whose input has been accepted by the search image input unit 26 is displayed.
- In the example of FIG. 8, no search image has been input yet, so no search image is displayed in the search image display area 154.
- the search image is, for example, an image arbitrarily selected by the user, and is input in order to search for an image similar to the search image from the content displayed on the timeline display unit 153 on the timeline.
- FIG. 9 is a diagram illustrating an example of a screen on which the images 201 to 203 are input as search images and the search results are displayed by operating the button 184 on the screen shown in FIG.
- the outer frames of the images 201 to 203 displayed in the search image display area 154 are displayed in a predetermined color.
- a predetermined color is displayed by a difference in hatching pattern.
- a color bar 211 represents the position of an image frame having a high degree of similarity with the image 201, and is a color bar having the same color as the color of the outer frame of the image 201 (for example, blue).
- a color bar 212 represents the position of an image frame having a high degree of similarity to the image 202, and is a color bar having the same color as the color of the outer frame of the image 202 (for example, green).
- the color bar 213 represents the position of an image frame having a high degree of similarity to the image 203, and is a color bar having the same color as the color of the outer frame of the image 203 (for example, red).
- In this way, the user can see at a glance where, and for how long, the target image (search image) appears across the plurality of contents. Furthermore, the user can see at a glance how the parts displaying the plurality of images (for example, the images 201 to 203) are combined within the plurality of contents. The search results can thus be shown on one screen, so that, for example, how the different contents were edited can be compared.
- the moving image display unit 151 displays a moving image obtained by reproducing the content selected by the user from the content displayed on the timeline display unit 153.
- FIG. 10 is a diagram illustrating an example of a screen when a moving image is displayed on the moving image display unit 151. In the example of FIG. 10, it is assumed that the user selects the scene display area 171-3 using a pointing device (not shown) and operates the button 181 to reproduce the content.
- the button 181 configured as a GUI is a button for reproducing the content and displaying the moving image on the moving image display unit 151.
- the button 182 is a button for stopping the content reproduction.
- Since the scene display area 171-3 is selected, its periphery is highlighted.
- In the drawing, the periphery of the scene display area 171-3 is drawn with a dotted line to indicate the highlighting.
- The timeline display unit 153 indicates the position of the currently reproduced frame with a slider.
- In this example, the slider 191-3 is displayed.
- the slider 191-3 is, for example, a red rectangular frame displayed superimposed on the scene display area 171-3, and is displayed so that the length in the horizontal direction in the figure increases as time passes.
- the right end portion of the slider 191-3 represents the current content playback position. Since a predetermined time has elapsed since the content was reproduced, the right end portion of the slider 191-3 has moved to the right on the scene display area 171-3.
- In this example, the slider 191-3 is displayed because the content of the scene display area 171-3 is being reproduced; when other content is reproduced, a slider is displayed on the scene display area corresponding to that content.
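A minimal sketch of how the slider's growing horizontal extent could be computed from the playback position (the patent does not specify the mapping; the function and its arguments are illustrative assumptions):

```python
def slider_width(current_frame, first_frame, last_frame, area_width_px):
    """Map the current playback frame to the slider's horizontal extent in pixels,
    clamped to the content's frame range so the slider never overflows."""
    total = last_frame - first_frame
    elapsed = min(max(current_frame - first_frame, 0), total)
    return round(area_width_px * elapsed / total)

# Content spanning frames 1000..2000, drawn in a 300 px scene display area:
print(slider_width(1500, 1000, 2000, 300))  # 150 -> right edge at mid-area
print(slider_width(2500, 1000, 2000, 300))  # 300 -> clamped to the full width
```

Redrawing the slider with this width each frame produces exactly the behavior described: the right end marks the playback position and moves rightward as time passes.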
- the search image input unit 26 receives an input of a search image.
- the search image is, for example, an image (still image) arbitrarily selected by the user, and is input to search for an image similar to the search image from the content input from the content input unit 21.
- the images 101 to 103 in FIG. 6 are input as search images.
- In step S52, the image feature amount extraction unit 23 analyzes the search image input in step S51.
- In step S53, the image feature amount extraction unit 23 extracts the image feature amount of the search image as a result of the analysis in step S52.
- In step S54, the search unit 27 compares, by a predetermined method, the image feature amount of the search image extracted in step S53 with the image feature amounts stored in the image feature amount database 25. As a result, the similarity between the image feature amount of the search image and the image feature amount of each one-frame still image constituting the content stored in the image feature amount database 25 is calculated as a numerical value.
- In step S55, the search unit 27 identifies, for example, the still image frames whose image feature amounts have a similarity to the image feature amount of the search image that is equal to or greater than a predetermined threshold.
- In step S56, the search unit 27 notifies the output unit 28 of the search result. At this time, the frame numbers of the still images identified in step S55 are supplied to the output unit 28.
- In step S57, the output unit 28 reads the content metadata from the metadata database 24 and generates display data for the timeline display of the search results based on the frame numbers of the still images supplied in step S56 and the read metadata. Thereby, for example, a screen such as that described above with reference to FIG. 7 or FIG. 9 is displayed.
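Steps S54 and S55 can be sketched with any vector similarity; the patent says only "a predetermined method", so cosine similarity is used here purely as an example, and all names, the feature vectors, and the threshold are illustrative assumptions.

```python
import math

def cosine_similarity(a, b):
    """Similarity in [−1, 1] between two feature vectors (step S54's comparison)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def search_similar_frames(query_feature, frame_features, threshold=0.9):
    """Return frame numbers whose stored feature is similar enough to the query
    (step S55's thresholding), in frame order."""
    return [n for n, f in sorted(frame_features.items())
            if cosine_similarity(query_feature, f) >= threshold]

# Tiny per-frame feature database: frames 0 and 1 resemble the query, frame 2 does not.
frame_features = {
    0: [1.0, 0.0, 0.0],
    1: [0.9, 0.1, 0.0],
    2: [0.0, 1.0, 0.0],
}
print(search_similar_frames([1.0, 0.0, 0.0], frame_features))  # [0, 1]
```

The returned frame numbers are what step S56 would hand to the output unit for timeline rendering.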
- the output unit 28 appropriately generates display data in response to GUI operations on the screen.
- FIG. 12 is a diagram showing an example of a screen that displays the result of searching for a search image for a certain content, similar to the screen described above with reference to FIG. In this example, the content of a baseball broadcast program is retrieved and displayed on a timeline.
- In this example, the image 104 and the image 105 are displayed in the search image display area 54 as search images.
- That is, two of the nine representative images displayed on the representative image display unit 52 are selected as search images.
- the outer frames of the images 104 and 105 displayed in the search image display area 54 are displayed in a predetermined color.
- the outer frame of the image 104 is displayed in blue, and the outer frame of the image 105 is displayed in green.
- a predetermined color is displayed by a difference in hatching pattern.
- color bars representing the positions of image frames similar to the images 104 and 105 are displayed superimposed on rectangles corresponding to the respective scenes in the scene display area 71.
- Each of the color bars shown in FIG. 12 is a color bar having the same color as the color of the outer frame of the image 104 or the color of the outer frame of the image 105.
- For example, the image 104 shows the pitcher throwing, and the image 105 shows an outfielder running (chasing the ball).
- Therefore, in a portion of the scene display area 71 where a color bar for frames similar to the image 104 is displayed with a color bar for frames similar to the image 105 to its right, it can be seen that the batted ball flew to the outfield.
- Further, if a search is also made for the image 106 of a runner on first base, and a portion is found in which color bars for images similar to the image 104, the image 105, and the image 106 appear in that order, it can be seen that the batter hit a single in that portion.
- Similarly, if a search is made for the image 107 of a runner on second base, and a portion is found in which color bars for images similar to the image 104, the image 105, and the image 107 appear in that order, it can be seen that the batter hit a double in that portion.
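Detecting such patterns amounts to checking whether a sequence of labeled color bars contains a given ordered subsequence; a minimal sketch (the labels and function name are hypothetical, not from the patent):

```python
def contains_in_order(events, pattern):
    """True if the labels in `pattern` occur somewhere in `events` in that order,
    not necessarily adjacent (other bars may appear in between)."""
    it = iter(events)
    # `label in it` advances the iterator, so each match must come after the last.
    return all(label in it for label in pattern)

# Color-bar labels read left to right along the timeline:
events = ["pitch", "outfield", "runner_first", "pitch"]
print(contains_in_order(events, ["pitch", "outfield", "runner_first"]))   # True  -> single
print(contains_in_order(events, ["pitch", "outfield", "runner_second"]))  # False -> no double here
```

The same check, run over each scene's bars, would flag the "single hit" and "double" portions the text describes.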
- By combining the rectangles representing the scenes displayed in the scene display area 71 with the thumbnails displayed on them, the content can be understood even more easily.
- According to the present invention, the contents of content can be visualized and displayed, and the contents of multiple items can be compared, in ways that conventional techniques do not allow. Therefore, the substance of moving image content can be grasped more easily.
- In the above description, the content is displayed on the timeline with the horizontal direction in the figures corresponding to the time axis, but the timeline may instead be displayed with the vertical direction corresponding to the time axis.
- the series of processes described above can be executed by hardware or can be executed by software.
- When the series of processes is executed by software, the program constituting the software is installed from a network or a recording medium into a computer built into dedicated hardware, or into a general-purpose personal computer 700 such as that shown in FIG. 13, which can execute various functions when various programs are installed.
- In FIG. 13, a CPU (Central Processing Unit) 701 executes various processes according to a program stored in a ROM (Read Only Memory) 702 or a program loaded from a storage unit 708 into a RAM (Random Access Memory) 703.
- the RAM 703 also appropriately stores data necessary for the CPU 701 to execute various processes.
- the CPU 701, the ROM 702, and the RAM 703 are connected to each other via a bus 704.
- An input / output interface 705 is also connected to the bus 704.
- The input / output interface 705 is connected to an input unit 706 composed of a keyboard, a mouse, and the like, and to an output unit 707 composed of a display such as an LCD (Liquid Crystal Display) and a speaker.
- the input / output interface 705 is connected to a storage unit 708 composed of a hard disk and a communication unit 709 composed of a network interface card such as a modem and a LAN card.
- the communication unit 709 performs communication processing via a network including the Internet.
- a program constituting the software is installed from a network such as the Internet or a recording medium such as a removable medium 711.
- As shown in FIG. 13, the recording medium distributed separately from the apparatus main body to deliver the program to the user includes removable media 711 such as magnetic disks (including floppy disks (registered trademark)), optical discs (including CD-ROM (Compact Disc Read Only Memory) and DVD (Digital Versatile Disc)), magneto-optical disks (including MD (Mini Disc) (registered trademark)), and semiconductor memories on which the program is recorded. It also includes the ROM 702 in which the program is recorded and the hard disk included in the storage unit 708, which are provided to the user already incorporated in the apparatus main body.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Library & Information Science (AREA)
- General Engineering & Computer Science (AREA)
- Databases & Information Systems (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- General Physics & Mathematics (AREA)
- Computer Graphics (AREA)
- Human Computer Interaction (AREA)
- Computational Linguistics (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Television Signal Processing For Recording (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Claims (8)
- A content processing apparatus comprising: metadata extraction means for extracting, from input moving image content, metadata including representative images, which are still images, and information specifying the frames corresponding to the representative images; and timeline display means for generating, based on the metadata, display data for displaying the content on a timeline, wherein, when display of thumbnails of the representative images is commanded, the timeline display means displays the thumbnails of the representative images together with the timeline-displayed content, in association with the temporal positions of the frames of the representative images within the content.
- The content processing apparatus according to claim 1, wherein the metadata includes information on scene change points of the content, and the timeline display means identifies, based on the metadata, the scenes constituting the content, displays the content on a timeline by displaying figures representing the respective scenes, sized according to the temporal length of each scene, with the horizontal or vertical direction of the screen as the time axis, and displays the thumbnails of the representative images in association with parts of those figures.
- The content processing apparatus according to claim 2, wherein the timeline display means displays the content on the timeline by displaying the figure representing each scene in a representative color of the scene identified by a predetermined method.
- The content processing apparatus according to claim 3, wherein, when reproduction of the timeline-displayed content is commanded, the timeline display means reproduces the content to display a moving image at a predetermined position on the screen, and displays a slider for identifying, on the time axis, the position of the frame being reproduced in the content.
- The content processing apparatus according to claim 4, wherein, when the position on the time axis identified by the slider reaches the position of the frame of a representative image, the timeline display means changes the display mode of the thumbnail corresponding to that representative image.
- The content processing apparatus according to claim 5, wherein the timeline display means generates display data for displaying a screen having a representative image display unit that displays a list of the representative images, a moving image display unit that displays the moving image of the reproduced content, and a timeline display unit that displays the content on a timeline together with the thumbnails.
- A content processing method comprising the steps of: extracting, by metadata extraction means, from input moving image content, metadata including representative images, which are still images, and information specifying the frames corresponding to the representative images; and generating, by timeline display means, display data for displaying the content on a timeline based on the metadata, wherein, when display of thumbnails of the representative images is commanded, the thumbnails of the representative images are displayed together with the timeline-displayed content, in association with the temporal positions of the frames of the representative images within the content.
- A program for causing a computer to function as a content processing apparatus comprising: metadata extraction means for extracting, from input moving image content, metadata including representative images, which are still images, and information specifying the frames corresponding to the representative images; and timeline display means for generating, based on the metadata, display data for displaying the content on a timeline, wherein, when display of thumbnails of the representative images is commanded, the timeline display means displays the thumbnails of the representative images together with the timeline-displayed content, in association with the temporal positions of the frames of the representative images within the content.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020127025204A KR20130038820A (ko) | 2010-04-09 | 2011-03-30 | 콘텐츠 처리 장치 및 방법, 및 프로그램 |
CN2011800172341A CN102823265A (zh) | 2010-04-09 | 2011-03-30 | 内容处理装置和方法及程序 |
EP11765643.9A EP2557781A4 (en) | 2010-04-09 | 2011-03-30 | DEVICE, METHOD, AND PROGRAM FOR CONTENT PROCESSING |
US13/635,059 US9325946B2 (en) | 2010-04-09 | 2011-03-30 | Content processing apparatus and method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-090607 | 2010-04-09 | ||
JP2010090607A JP2011223326A (ja) | 2010-04-09 | 2010-04-09 | コンテンツ処理装置および方法、並びにプログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011125743A1 true WO2011125743A1 (ja) | 2011-10-13 |
Family
ID=44762685
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/058021 WO2011125743A1 (ja) | 2010-04-09 | 2011-03-30 | コンテンツ処理装置および方法、並びにプログラム |
Country Status (6)
Country | Link |
---|---|
US (1) | US9325946B2 (ja) |
EP (1) | EP2557781A4 (ja) |
JP (1) | JP2011223326A (ja) |
KR (1) | KR20130038820A (ja) |
CN (1) | CN102823265A (ja) |
WO (1) | WO2011125743A1 (ja) |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011223325A (ja) * | 2010-04-09 | 2011-11-04 | Sony Corp | コンテンツ検索装置および方法、並びにプログラム |
US9762967B2 (en) | 2011-06-14 | 2017-09-12 | Comcast Cable Communications, Llc | System and method for presenting content with time based metadata |
JP5832877B2 (ja) * | 2011-12-05 | 2015-12-16 | 株式会社ビジネス・ブレークスルー | 視聴覚端末、視聴覚番組の再生プログラム、再生履歴集積システムおよび視聴覚番組を遠隔配信する方法 |
JP6142551B2 (ja) * | 2013-01-31 | 2017-06-07 | 株式会社ニコン | 画像編集装置及び画像編集プログラム |
KR20140100784A (ko) * | 2013-02-07 | 2014-08-18 | 삼성전자주식회사 | 디스플레이 장치 및 디스플레이 방법 |
JP6289107B2 (ja) * | 2014-01-14 | 2018-03-07 | キヤノン株式会社 | 画像再生装置、その制御方法、および制御プログラム |
US9972121B2 (en) * | 2014-04-22 | 2018-05-15 | Google Llc | Selecting time-distributed panoramic images for display |
US9934222B2 (en) | 2014-04-22 | 2018-04-03 | Google Llc | Providing a thumbnail image that follows a main image |
USD781317S1 (en) | 2014-04-22 | 2017-03-14 | Google Inc. | Display screen with graphical user interface or portion thereof |
USD781318S1 (en) | 2014-04-22 | 2017-03-14 | Google Inc. | Display screen with graphical user interface or portion thereof |
USD780777S1 (en) | 2014-04-22 | 2017-03-07 | Google Inc. | Display screen with graphical user interface or portion thereof |
JP6398341B2 (ja) * | 2014-06-09 | 2018-10-03 | 富士通株式会社 | 映像抽出方法、映像再生方法、プログラム、及び装置 |
TWI549726B (zh) * | 2014-10-01 | 2016-09-21 | 虹映科技股份有限公司 | 運動影片特徵顯示方法與系統 |
FR3040507A1 (fr) * | 2015-08-31 | 2017-03-03 | Pic-Side | Procede d'identification et de traitement d'images |
JP6399145B2 (ja) * | 2017-04-27 | 2018-10-03 | 株式会社ニコン | 画像編集装置及び動画像の表示方法 |
CN108663374B (zh) * | 2018-05-16 | 2022-02-25 | 京东方科技集团股份有限公司 | 显示装置的测试方法、测试装置和测试系统 |
JP2019004506A (ja) * | 2018-09-06 | 2019-01-10 | 株式会社ニコン | 画像編集装置及び動画像の表示方法 |
US10460766B1 (en) | 2018-10-10 | 2019-10-29 | Bank Of America Corporation | Interactive video progress bar using a markup language |
US11099811B2 (en) | 2019-09-24 | 2021-08-24 | Rovi Guides, Inc. | Systems and methods for displaying subjects of an audio portion of content and displaying autocomplete suggestions for a search related to a subject of the audio portion |
KR20210104979A (ko) * | 2020-02-18 | 2021-08-26 | 한화테크윈 주식회사 | 영상 검색 장치 및 이를 포함하는 네트워크 감시 카메라 시스템 |
US11782980B1 (en) | 2021-03-26 | 2023-10-10 | Meta Platforms, Inc. | Video navigation normalized by relevance |
CN113438538B (zh) * | 2021-06-28 | 2023-02-10 | 康键信息技术(深圳)有限公司 | 短视频预览方法、装置、设备及存储介质 |
CN115119064B (zh) * | 2022-06-23 | 2024-02-23 | 北京字跳网络技术有限公司 | 一种视频处理方法、装置、设备及存储介质 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0767073A (ja) | 1993-08-23 | 1995-03-10 | Ricoh Co Ltd | 画像記録再生装置 |
JP2001238154A (ja) | 2000-02-21 | 2001-08-31 | Sharp Corp | 動画表示装置 |
JP2003333484A (ja) * | 2002-05-15 | 2003-11-21 | Nec Corp | 番組録画再生システム、番組録画再生方法および番組録画再生プログラム |
JP2005080027A (ja) * | 2003-09-02 | 2005-03-24 | Sony Corp | 動画像データの編集装置および動画像データの編集方法 |
WO2005050986A1 (ja) * | 2003-11-19 | 2005-06-02 | National Institute Of Information And Communications Technology, Independent Administrative Agency | 映像内容の提示方法及び装置 |
JP2010028184A (ja) * | 2008-02-04 | 2010-02-04 | Fuji Xerox Co Ltd | 映像ナビゲーション方法、映像ナビゲーションシステム、及び映像ナビゲーションプログラム |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0778804B2 (ja) | 1992-05-28 | 1995-08-23 | 日本アイ・ビー・エム株式会社 | シーン情報入力システムおよび方法 |
US7409145B2 (en) * | 2003-01-02 | 2008-08-05 | Microsoft Corporation | Smart profiles for capturing and publishing audio and video streams |
CN100581230C (zh) * | 2003-04-24 | 2010-01-13 | 索尼株式会社 | 用于记录av流的信息处理装置和信息处理方法 |
US7352952B2 (en) * | 2003-10-16 | 2008-04-01 | Magix Ag | System and method for improved video editing |
WO2005101237A1 (en) * | 2004-04-14 | 2005-10-27 | Tilefile Pty Ltd | A media package and a system and method for managing a media package |
JP2006164337A (ja) * | 2004-12-02 | 2006-06-22 | Sony Corp | データ処理装置およびデータ処理方法、プログラムおよびプログラム記録媒体、並びにデータ記録媒体 |
JP4356762B2 (ja) * | 2007-04-12 | 2009-11-04 | ソニー株式会社 | 情報提示装置及び情報提示方法、並びにコンピュータ・プログラム |
US8850318B2 (en) * | 2007-04-23 | 2014-09-30 | Digital Fountain, Inc. | Apparatus and method for low bandwidth play position previewing of video content |
-
2010
- 2010-04-09 JP JP2010090607A patent/JP2011223326A/ja not_active Withdrawn
-
2011
- 2011-03-30 KR KR1020127025204A patent/KR20130038820A/ko not_active Application Discontinuation
- 2011-03-30 CN CN2011800172341A patent/CN102823265A/zh active Pending
- 2011-03-30 WO PCT/JP2011/058021 patent/WO2011125743A1/ja active Application Filing
- 2011-03-30 EP EP11765643.9A patent/EP2557781A4/en not_active Withdrawn
- 2011-03-30 US US13/635,059 patent/US9325946B2/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0767073A (ja) | 1993-08-23 | 1995-03-10 | Ricoh Co Ltd | 画像記録再生装置 |
JP2001238154A (ja) | 2000-02-21 | 2001-08-31 | Sharp Corp | 動画表示装置 |
JP2003333484A (ja) * | 2002-05-15 | 2003-11-21 | Nec Corp | 番組録画再生システム、番組録画再生方法および番組録画再生プログラム |
JP2005080027A (ja) * | 2003-09-02 | 2005-03-24 | Sony Corp | 動画像データの編集装置および動画像データの編集方法 |
WO2005050986A1 (ja) * | 2003-11-19 | 2005-06-02 | National Institute Of Information And Communications Technology, Independent Administrative Agency | 映像内容の提示方法及び装置 |
JP2010028184A (ja) * | 2008-02-04 | 2010-02-04 | Fuji Xerox Co Ltd | 映像ナビゲーション方法、映像ナビゲーションシステム、及び映像ナビゲーションプログラム |
Non-Patent Citations (2)
Title |
---|
COREL, VIDEO STUDIO 12 PLUS USERS GUIDE, August 2008 (2008-08-01), pages 31 * |
See also references of EP2557781A4 * |
Also Published As
Publication number | Publication date |
---|---|
KR20130038820A (ko) | 2013-04-18 |
EP2557781A1 (en) | 2013-02-13 |
US20130011120A1 (en) | 2013-01-10 |
JP2011223326A (ja) | 2011-11-04 |
CN102823265A (zh) | 2012-12-12 |
US9325946B2 (en) | 2016-04-26 |
EP2557781A4 (en) | 2014-03-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011125743A1 (ja) | コンテンツ処理装置および方法、並びにプログラム | |
JP5471749B2 (ja) | コンテンツ検索装置および方法、並びにプログラム | |
US9881215B2 (en) | Apparatus and method for identifying a still image contained in moving image contents | |
JP5010292B2 (ja) | 映像属性情報出力装置、映像要約装置、プログラムおよび映像属性情報出力方法 | |
JP4496264B2 (ja) | 電子機器及び映像表示方法 | |
US20100094441A1 (en) | Image selection apparatus, image selection method and program | |
US20080044085A1 (en) | Method and apparatus for playing back video, and computer program product | |
US20100104261A1 (en) | Brief and high-interest video summary generation | |
US20030030852A1 (en) | Digital visual recording content indexing and packaging | |
JP2008276340A (ja) | 検索装置 | |
US8300894B2 (en) | Method for decomposition and rendering of video content and user interface for operating the method thereof | |
JPH11220689A (ja) | 映像ソフト処理装置及び同処理プログラム記録記憶媒体 | |
US20080266319A1 (en) | Video processing apparatus and method | |
JP2010081531A (ja) | 映像処理装置及びその方法 | |
KR20090114937A (ko) | 녹화된 뉴스 프로그램들을 브라우징하는 방법 및 이를 위한장치 | |
JP3906854B2 (ja) | 動画像の特徴場面検出方法及び装置 | |
JP4539884B2 (ja) | 再生装置、プログラム及び電子画面を構築する方法 | |
JP2000228755A (ja) | 映像ソフト表示装置及びその為のプログラムを記録した記憶媒体 | |
JP5600557B2 (ja) | コンテンツ紹介映像作成装置およびそのプログラム | |
KR20230094541A (ko) | 비디오 주석 관리 시스템 및 방법 | |
WO2022189359A1 (en) | Method and device for generating an audio-video abstract | |
JP4760893B2 (ja) | 動画記録再生装置 | |
JP2020167581A (ja) | 映像再生装置および映像再生方法 | |
JP2006157108A (ja) | 映像記録再生装置 | |
JP2007151118A (ja) | 動画像の特徴場面検出方法及び装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180017234.1 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11765643 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13635059 Country of ref document: US Ref document number: 2011765643 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 20127025204 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 8353/CHENP/2012 Country of ref document: IN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |