WO2003088665A1 - Metadata editing device, metadata reproduction device, metadata distribution device, metadata search device, metadata regeneration condition setting device, and metadata distribution method - Google Patents
Metadata editing device, metadata reproduction device, metadata distribution device, metadata search device, metadata regeneration condition setting device, and metadata distribution method
- Publication number
- WO2003088665A1 (PCT/JP2003/003450)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- metadata
- scene
- information
- unit
- content
- Prior art date
Links
- 238000009826 distribution Methods 0.000 title claims description 99
- 238000000034 method Methods 0.000 title claims description 16
- 238000005457 optimization Methods 0.000 claims abstract description 71
- 238000004458 analytical method Methods 0.000 claims description 100
- 238000011069 regeneration method Methods 0.000 claims description 52
- 230000008929 regeneration Effects 0.000 claims description 51
- 230000001172 regenerating effect Effects 0.000 claims description 16
- 238000001514 detection method Methods 0.000 claims description 13
- 238000000605 extraction Methods 0.000 claims description 9
- 239000000284 extract Substances 0.000 claims description 7
- 230000000007 visual effect Effects 0.000 claims description 4
- 230000011218 segmentation Effects 0.000 claims description 2
- 238000002716 delivery method Methods 0.000 claims 7
- 238000010586 diagram Methods 0.000 description 36
- 238000003860 storage Methods 0.000 description 15
- 230000006870 function Effects 0.000 description 7
- 230000002123 temporal effect Effects 0.000 description 5
- 239000003086 colorant Substances 0.000 description 2
- 238000004891 communication Methods 0.000 description 2
- 238000012905 input function Methods 0.000 description 2
- 238000007726 management method Methods 0.000 description 2
- 238000002910 structure generation Methods 0.000 description 2
- 238000006243 chemical reaction Methods 0.000 description 1
- 238000005304 joining Methods 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/854—Content authoring
- H04N21/8543—Content authoring using a description language, e.g. Multimedia and Hypermedia information coding Expert Group [MHEG], eXtensible Markup Language [XML]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/78—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/783—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/7847—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using low-level visual features of the video content
- G06F16/785—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using low-level visual features of the video content using colour or luminescence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/07—Responding to the occurrence of a fault, e.g. fault tolerance
- G06F11/14—Error detection or correction of the data by redundancy in operation
- G06F11/1402—Saving, restoring, recovering or retrying
- G06F11/1415—Saving, restoring, recovering or retrying at system level
- G06F11/1435—Saving, restoring, recovering or retrying at system level using file system or storage system metadata
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/71—Indexing; Data structures therefor; Storage structures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/78—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/78—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/783—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/7847—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using low-level visual features of the video content
- G06F16/786—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using low-level visual features of the video content using motion, e.g. object motion or camera motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/80—Information retrieval; Database structures therefor; File system structures therefor of semi-structured data, e.g. markup language structured data such as SGML, XML or HTML
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/34—Indicating arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
- H04N21/234336—Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by media transcoding, e.g. video is transformed into a slideshow of still pictures or audio is converted into text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
- H04N21/234363—Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by altering the spatial resolution, e.g. for clients with a lower screen resolution
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/235—Processing of additional data, e.g. scrambling of additional data or processing content descriptors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/435—Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/466—Learning process for intelligent management, e.g. learning user preferences for recommending movies
- H04N21/4668—Learning process for intelligent management, e.g. learning user preferences for recommending movies for recommending content, e.g. movies
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/84—Generation or processing of descriptive data, e.g. content descriptors
- H04N21/8405—Generation or processing of descriptive data, e.g. content descriptors represented by keywords
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/845—Structuring of content, e.g. decomposing content into time segments
- H04N21/8456—Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments
Definitions
- Metadata editing device, metadata playback device, metadata distribution device, metadata search device, metadata regeneration condition setting device, and metadata distribution method
- The present invention relates to a metadata editing device, a metadata playback device, a metadata distribution device, and a metadata search device that divide multimedia content including moving images and audio into a plurality of scenes and generate metadata for each of the divided scenes, and further relates to a metadata regeneration condition setting device, a content distribution device, and a metadata distribution method.
- In a conventional technique, a video is divided into a plurality of scenes, and then section information necessary for reproduction of each scene, a scene number, and an index, which is a collection of images representing the scenes, are edited.
- The technique provides a means for creating each index and a means for giving each index a title indicating the purpose of the search; at the time of a search, the indexes are searched by title, and the scenes of the matching index are played back successively in scene-number order.
- The present invention has been made to solve the above-described problems. An object of the present invention is to obtain a metadata editing device capable of generating metadata, i.e., index information, that describes not only scene section information and titles but also the structure of content such as video data.
- Another object of the present invention is to obtain a metadata reproduction device, a metadata distribution device, a metadata search device, a metadata regeneration condition setting device, a content distribution device, and a metadata distribution method with which a user can collect and play back only the scenes that the user wants to see, or search for desired scenes using the features described in the metadata.
Disclosure of the Invention
- A metadata editing apparatus according to the present invention divides multimedia content including at least one of a moving image and sound into a plurality of scenes, and includes: a scene division unit that generates, for each of the divided scenes, scene section information metadata indicating the start position and end position of the scene; a scene description editing unit that performs hierarchical editing of the scenes of the multimedia content based on the scene section information metadata from the scene division unit and generates scene structure information metadata describing the hierarchical structure of the multimedia content; and a metadata description unit that integrates the scene section information metadata and the scene structure information metadata and generates metadata describing the contents and structure of the multimedia content according to a prescribed format.
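The relationship between scene section information, hierarchical scene structure, and the integrated metadata description can be sketched as a small data model. This is an illustrative Python sketch only; the class and field names, and the use of a nested dict as a stand-in for the "prescribed format", are assumptions, not the patent's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Scene:
    # Scene section information: start and end positions (e.g. frame numbers),
    # plus child scenes produced by hierarchical editing.
    title: str
    start: int
    end: int
    children: list = field(default_factory=list)

def build_metadata(scene: Scene) -> dict:
    """Integrate scene section information and scene structure information
    into one metadata description (a nested dict standing in for the
    prescribed format)."""
    return {
        "title": scene.title,
        "section": {"start": scene.start, "end": scene.end},
        "scenes": [build_metadata(c) for c in scene.children],
    }

# A news program divided into scenes and then edited hierarchically,
# mirroring the news-video example used throughout the description.
news = Scene("News Program", 0, 9000, [
    Scene("News Digest", 0, 1000),
    Scene("News", 1000, 6000, [
        Scene("Domestic News", 1000, 3000),
        Scene("International News", 3000, 5000),
        Scene("Economic News", 5000, 6000),
    ]),
])
metadata = build_metadata(news)
```

The recursion reflects that scene structure information is itself hierarchical: each node carries its own section information, and the metadata description unit's integration step corresponds to serializing the whole tree at once.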
- A metadata distribution device according to the present invention includes: a hint information analysis unit that analyzes metadata optimization hint information describing the types and contents of descriptors included in metadata; a metadata analysis/regeneration unit that, based on the analyzed metadata optimization hint information and conditions relating to metadata regeneration, analyzes metadata describing the contents and structure of multimedia content including at least one of a moving image and sound, and regenerates second metadata; and a metadata distribution unit that distributes the second metadata regenerated by the metadata analysis/regeneration unit to a client terminal.
- A metadata distribution method according to the present invention includes the steps of: analyzing metadata optimization hint information describing the types of descriptors included in metadata; analyzing, based on the analyzed metadata optimization hint information and conditions relating to metadata regeneration, metadata describing the contents and structure of multimedia content including at least one of a moving image and sound, and regenerating second metadata; and distributing the regenerated second metadata to a client terminal.
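The three steps of the distribution method above (analyze the hint information, regenerate second metadata under a condition, distribute the result) can be sketched as follows. The dict-based hint format, the "importance" descriptor, and the threshold-style regeneration condition are illustrative assumptions, not details taken from the patent:

```python
def analyze_hints(hints: dict) -> set:
    # The optimization hint information lists which descriptor types the
    # metadata contains, so the server can skip work for absent descriptors.
    return {name for name, present in hints.items() if present}

def regenerate(scenes: list, available: set, min_importance: float) -> list:
    # Regenerate "second metadata": keep only scenes satisfying the
    # regeneration condition, and only if the importance descriptor is
    # actually present according to the hints.
    if "importance" not in available:
        return scenes  # nothing to filter on; deliver unchanged
    return [s for s in scenes if s["importance"] >= min_importance]

hints = {"importance": True, "color": False}
scenes = [
    {"title": "News Digest", "importance": 0.9},
    {"title": "Weather", "importance": 0.3},
]
# The regenerated second metadata would then be sent to the client terminal.
second = regenerate(scenes, analyze_hints(hints), 0.5)
```

The point of the hint-analysis step is visible here: when the hints say a descriptor is absent, the analysis/regeneration unit can return early instead of parsing the full metadata.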
- FIG. 1 is a block diagram showing a configuration of a metadata editing device according to Embodiment 1 of the present invention.
- FIG. 2 is a diagram showing a news video which is an example of an editing target of the metadata editing device according to Embodiment 1 of the present invention.
- FIG. 3 is a diagram showing an example of scene section information metadata of a scene of a scene division unit of the metadata editing apparatus according to Embodiment 1 of the present invention.
- FIG. 4 is a diagram showing an example of scene structure information metadata of a scene description editing unit of the metadata editing device according to Embodiment 1 of the present invention.
- FIG. 5 is a diagram showing an example of a screen image of a content reproduction / display unit and a user input unit of the metadata editing apparatus according to the first embodiment of the present invention.
- FIG. 6 is a block diagram illustrating a configuration of a metadata editing device according to Embodiment 2 of the present invention.
- FIG. 7 is a diagram illustrating an operation of the metadata editing device according to Embodiment 2 of the present invention.
- FIG. 8 is a block diagram illustrating a configuration of a metadata reproduction device according to Embodiment 3 of the present invention.
- FIG. 9 is a diagram illustrating an operation of the metadata reproduction device according to Embodiment 3 of the present invention.
- FIG. 10 is a block diagram showing a configuration of a content distribution system according to Embodiment 4 of the present invention.
- FIG. 11 is a diagram showing structure information of a content (an example of a news video) output from the metadata analysis unit of the metadata distribution server according to the fourth embodiment of the present invention.
- FIG. 12 is a diagram showing an example of the structure of the content after reconfiguration by the metadata reproduction generator of the content distribution system according to the fourth embodiment of the present invention.
- FIG. 13 is a block diagram showing a configuration of a metadata distribution server according to Embodiment 5 of the present invention.
- FIG. 14 is a diagram illustrating an example of video content for explaining metadata optimization hint information by the metadata distribution server according to the fifth embodiment of the present invention.
- FIG. 15 is a diagram illustrating a meta-data according to the fifth embodiment of the present invention.
- FIG. 6 is a diagram showing an example of description of metadata when MPEG-7 is used by a data distribution server,
- FIG. 16 is a diagram showing a format example of metadata optimization hint information used by a metadata distribution server according to Embodiment 5 of the present invention.
- FIG. 17 is a diagram showing metadata optimization hint information used by the metadata distribution server according to Embodiment 5 of the present invention.
- FIG. 18 is a flowchart showing the operation of the metadata analysis/regeneration unit of the metadata distribution server according to the fifth embodiment of the present invention.
- FIG. 19 is a flowchart showing the operation of the metadata analysis / regeneration unit of the metadata distribution server according to the fifth embodiment of the present invention.
- FIG. 20 is a block diagram showing a configuration of a metadata search server according to Embodiment 6 of the present invention.
- FIG. 21 is a flowchart showing the operation of the metadata analysis unit of the metadata search server according to Embodiment 6 of the present invention.
- FIG. 22 is a block diagram showing a configuration of a client terminal according to Embodiment 7 of the present invention.
- FIG. 23 is a block diagram showing a configuration of a content distribution server according to Embodiment 8 of the present invention.
- a metadata distribution server according to the fifth embodiment
- a metadata search server according to the sixth embodiment
- a client terminal according to the seventh embodiment, and
- In Embodiment 1, a metadata editing device will be described that divides multimedia content including moving images and audio into a plurality of scenes and creates metadata (index information) including a hierarchical structure description of the scenes and the feature amounts of each scene.
- FIG. 1 is a block diagram illustrating a configuration of a metadata editing device according to Embodiment 1 of the present invention. In each drawing, the same reference numerals indicate the same or corresponding parts. In FIG.
- the metadata editing device 100 includes a content playback / display unit 2, a scene division unit 3, a thumbnail image generation unit 4, a scene description editing unit 5, a text information addition unit 6, and a feature.
- An extraction unit 7, a user input unit 8, and a metadata description unit 9 are provided.
- The content playback/display unit 2 plays back and displays the multimedia content 10 to be edited, which is composed of video data, audio data, and the like.
- the scene division unit 3 divides the content into a plurality of scenes.
- the thumbnail image generation unit 4 extracts a representative frame of the scene as a thumbnail image.
- the scene description editing unit 5 hierarchically edits the scenes by grouping the scenes divided by the scene dividing unit 3, combining the scenes, deleting the scenes, and generating the related information of the scenes.
- the text information adding unit 6 adds various types of text information to each scene.
- the feature extraction unit 7 extracts features of a scene.
- the user input unit 8 inputs instruction information from the user to the content reproduction / display unit 2, the scene division unit 3, the thumbnail image generation unit 4, the scene description editing unit 5, and the text information addition unit 6.
- The metadata description unit 9 integrates the scene section information metadata 12, thumbnail image information metadata 13, scene structure information metadata 14, text information metadata 15, and feature description metadata 16 output from the scene division unit 3, the thumbnail image generation unit 4, the scene description editing unit 5, the text information addition unit 6, and the feature extraction unit 7, and generates metadata describing the contents and structure of the multimedia content according to the prescribed format.
- FIG. 2 is a diagram illustrating a configuration of a news video as an example of an editing target of the metadata editing apparatus according to the first embodiment. An example in which a news video having the configuration shown in FIG. 2 is edited will be described.
- The content playback/display unit 2 of the metadata editing device 100 plays back and displays, for editing, the multimedia content 10 such as video content stored in a content storage unit (not shown) and obtained via a network or the like.
- FIG. 3 is a diagram showing an example of scene section information metadata of a scene division unit of the metadata editing device according to the first embodiment.
- the section information metadata 12 shown in FIG. 3 shows an example generated from the news video shown in FIG.
- The scene division unit 3 generates scene section information metadata 12 indicating the section information, i.e., the start position and end position, of each scene cut out from the news video content, such as "News Digest", "Domestic News", and "International News".
- The scene description editing unit 5 hierarchically edits the scenes cut out by the scene division unit 3, based on the scene section information metadata 12 from the scene division unit 3, and outputs scene structure information metadata 14.
- Hierarchical editing of scenes includes, for example, grouping scenes, subdividing scenes, joining scenes, and deleting scenes.
- Grouping of scenes means collecting scenes related to a specific feature into one group: for example, the "Domestic News", "International News", and "Economic News" scenes of the news video shown in Fig. 2 are collected into a "News" group as shown in Fig. 4. Subdividing means dividing one scene into multiple scenes.
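The hierarchical editing operations named above (grouping, joining, and so on) can be sketched as simple list/tree manipulations. The helper names and the dict-based scene records are hypothetical, chosen only to illustrate the operations:

```python
def group(scenes: list, indices: set, title: str) -> list:
    """Group the selected scenes under a new parent node (hypothetical helper
    mirroring the 'add a node and attach scenes to it' operation)."""
    grouped = [s for i, s in enumerate(scenes) if i in indices]
    rest = [s for i, s in enumerate(scenes) if i not in indices]
    return rest + [{"title": title, "children": grouped}]

def merge(a: dict, b: dict) -> dict:
    """Join two adjacent scenes into one section spanning both."""
    return {"title": a["title"], "start": a["start"], "end": b["end"]}

scenes = [
    {"title": "Domestic News", "start": 100, "end": 300},
    {"title": "International News", "start": 300, "end": 500},
    {"title": "Sports", "start": 500, "end": 700},
]
# Group the two news scenes under a "News" node, as in the Fig. 4 example.
news_group = group(scenes, {0, 1}, "News")
# Join two adjacent scenes into a single longer scene.
merged = merge(scenes[0], scenes[1])
```

Deleting a scene would simply remove its record; subdividing would replace one record with several whose sections partition the original interval.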
- FIG. 4 is a diagram showing an example of the scene structure information metadata of the scene description editing unit of the metadata editing device according to Embodiment 1.
- The scene structure information metadata 14 shown in Fig. 4 describes the hierarchical structure of the video content generated as a result of editing by the scene description editing unit 5.
- Through scene editing operations such as grouping, subdividing, and merging, the scene description editing unit 5 organizes the scenes hierarchically into "News Digest", "News", "Features", "Sports", and so on, with "News" further divided into "Domestic News", "International News", and "Economic News".
- The thumbnail image generation unit 4 generates a representative frame as a thumbnail image from each scene cut out by the scene division unit 3, based on the scene section information metadata 12 from the scene division unit 3, and outputs the generated thumbnail information to the metadata description unit 9, where it is registered as thumbnail image information metadata 13.
- The user can select a thumbnail through the user input unit 8; alternatively, the first frame of a scene, frames at fixed time intervals, or frames at automatically detected scene change points can be used automatically as representative frames.
- The thumbnail image information metadata 13 is position information (a frame number or time) of the thumbnail in the video content, or location information such as the URL of the thumbnail image.
- The feature extraction unit 7 uses the scene section information metadata 12 from the scene division unit 3 to extract from each scene the visual features of the scene, such as motion, color, or the shape of objects included in the scene.
- The extracted feature amounts are output to the metadata description unit 9 and registered as feature description metadata 16.
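As one concrete example of such a feature amount, a coarse color histogram over a scene's representative frame could serve as a color descriptor. This is a minimal sketch of the idea only, not the descriptor the patent actually uses:

```python
def color_histogram(pixels: list, bins: int = 4) -> list:
    # Quantize 8-bit RGB values into a coarse per-channel histogram — a
    # stand-in for the low-level color features mentioned in the text.
    hist = [[0] * bins for _ in range(3)]
    step = 256 // bins
    for r, g, b in pixels:
        hist[0][r // step] += 1
        hist[1][g // step] += 1
        hist[2][b // step] += 1
    return hist

# Two dummy pixels sampled from a scene's representative frame.
features = color_histogram([(255, 0, 0), (200, 10, 30)])
```

A real extractor would run over decoded frames; the resulting histogram (or a motion/shape analogue) is what would be registered as feature description metadata 16.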
- The text information addition unit 6 allows the user to add various text information, such as a title, an abstract, keywords, comments, and scene importance, to each scene based on the scene section information metadata 12 from the scene division unit 3.
- FIG. 5 shows an example of a screen image of the content reproduction / display unit and the user input unit 8 of the metadata editing apparatus according to the first embodiment.
- The video playback screen G1 corresponds to an example of a screen image of the content playback/display unit 2; on it, the content to be edited is played back and displayed.
- Buttons such as "play", "stop", "rewind", "fast forward", and "frame forward" are provided, as in a normal video playback device.
- a scene division instruction screen G2 is displayed below the video reproduction screen G1.
- The scene division instruction screen G2 is, for example, in a slider format. While watching the video displayed on the video playback screen G1, the user can indicate the start position and end position of a scene. The scene division instruction screen G2 can also indicate the position of a thumbnail between the start position and the end position of the scene.
- the thumbnail image generation unit 4 generates a thumbnail image from the frame at the specified position of the video content.
- the thumbnail image whose position is specified by the scene division instruction screen G2 is displayed on the scene division information display screen G3 as scene division information.
- On the scene division information display screen G3, in addition to the thumbnail images, information indicating the start position and end position of each scene can be displayed as shown in FIG. 5.
- On the tree structure generation instruction display screen G4, the user gives instructions for editing scenes.
- While viewing the scene division information, such as the thumbnail images displayed on the scene division information display screen G3, the user generates a tree that represents the hierarchical structure of the video content.
- As an operation method, for example, when grouping scenes, the user adds a new node on the tree and adds the scenes to be grouped to that node.
- As a method of adding a scene, a method of selecting the scene to be added on the scene division information display screen G3 and adding it to a node by drag-and-drop is conceivable.
- The scene division information display screen G3 and the tree structure generation instruction display screen G4 also serve as a user interface for giving text information to a scene through the text information addition unit 6 by selecting the scene.
- the user input unit 8 is provided as a text input device for inputting text information for a scene.
- The metadata description unit 9 integrates the various metadata output from the scene division unit 3, the thumbnail image generation unit 4, the scene description editing unit 5, the text information addition unit 6, and the feature extraction unit 7, and generates a metadata file described according to a prescribed description format.
- The metadata description format may be a proprietary format, but in the first embodiment, MPEG-7 standardized by ISO is used. MPEG-7 defines formats for describing the structure and features of content, and includes an XML file format and a binary format.
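To make the MPEG-7 option concrete, a small scene description in MPEG-7-like XML could be emitted as follows. The element names loosely follow the MPEG-7 Multimedia Description Schemes flavour but this sketch is not schema-valid MPEG-7, and the time-code values are illustrative:

```python
import xml.etree.ElementTree as ET

def describe_scene(parent: ET.Element, title: str, start: str, duration: str) -> ET.Element:
    # One video segment with a title and a media-time section — roughly the
    # shape of the scene section information produced by the editing device.
    seg = ET.SubElement(parent, "VideoSegment")
    ET.SubElement(seg, "Title").text = title
    time = ET.SubElement(seg, "MediaTime")
    ET.SubElement(time, "MediaTimePoint").text = start
    ET.SubElement(time, "MediaDuration").text = duration
    return seg

root = ET.Element("Mpeg7")
describe_scene(root, "News Digest", "T00:00:00", "PT1M30S")
xml_text = ET.tostring(root, encoding="unicode")
```

Nesting `VideoSegment` elements inside one another would express the hierarchical scene structure; MPEG-7's binary form (BiM) would carry the same description more compactly.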
- Since the scene description editing unit 5, which edits scenes hierarchically, and the feature extraction unit 7, which extracts features from scenes, are provided, it is possible to generate metadata that describes the hierarchical structure of content such as video data and the features of each scene.
- Various cases are assumed for the multimedia content 10 input to the content reproduction/display unit 2: it may be obtained from a content server (not shown) on the network, from a content storage unit (not shown) in the metadata editing device 100, or from a storage medium (not shown) such as a CD or DVD.
- Likewise, the metadata output from the metadata description unit 9 may be stored in a metadata server (not shown) on the network, in a metadata storage unit (not shown) in the metadata editing apparatus, or together with the content on a storage medium (not shown) such as a CD or DVD.
- In the above description, both the scene description editing unit 5 and the feature extraction unit 7 are provided. However, the present invention is not limited to this; of course, only the scene description editing unit 5 or only the feature extraction unit 7 may be provided.
- Second Embodiment. In the first embodiment, all scenes are divided manually. In the second embodiment, a metadata editing device provided with a scene change detection unit that automatically detects scene change points will be described.
- FIG. 6 is a block diagram showing a configuration of a metadata editing device according to Embodiment 2 of the present invention.
- The metadata editing device 100A includes a content playback/display unit 2, a scene division unit 3, a thumbnail image generation unit 4, a scene description editing unit 5, a text information provision unit 6, a feature extraction unit 7, a user input unit 8, a metadata description unit 9, and a scene change detection unit 39.
- Reference numeral 40 denotes automatically detected scene start position information.
- FIG. 7 is a diagram for explaining the operation of the metadata editing device according to the second embodiment of the present invention.
- The scene change detection section 39 automatically detects scene change points and cut points.
- the scene change detection is performed based on, for example, a pixel difference between frames, a color difference between frames, and a histogram difference of brightness.
- the scene division unit 3 determines the start position and the end position of the scene based on the scene change point detected by the scene change detection unit 39.
- the processing of the scene change detection unit 39 and the scene division unit 3 will be described in detail by taking a case where the content to be edited is a news video as an example.
- the scene change detection unit 39 calculates a color histogram for each frame.
- the color system includes HSV, RGB, YCbCr, etc.
- HSV color space is used. This HSV color space is composed of three elements: hue (H), saturation (S), and lightness (V).
- (Equation 1) gives the inter-frame histogram difference, where H, S, and V denote the hue, saturation, and lightness histograms, respectively, and bin_H, bin_S, and bin_V denote the numbers of histogram elements of each.
- As initial features of a scene, the mean (average) and standard deviation (sd) of the histogram differences between the first N frames are calculated according to (Equation 2) below.
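- As an illustration, the inter-frame histogram difference and the initial scene features (mean and standard deviation over the first N frames) can be sketched in Python as follows. This is only a sketch: the dictionary-of-bin-lists frame representation, the absolute-difference metric, and the bin counts are assumptions standing in for (Equation 1) and (Equation 2), which are not reproduced in this text.

```python
def histogram_diff(hist_a, hist_b):
    """Sum of absolute bin-wise differences between two HSV histograms.

    Each histogram is a dict mapping "H", "S", "V" to equal-length
    lists of bin counts.
    """
    return sum(
        abs(a - b)
        for channel in ("H", "S", "V")
        for a, b in zip(hist_a[channel], hist_b[channel])
    )


def initial_scene_features(histograms, n):
    """Mean and standard deviation of the histogram differences
    between the first N frames, used as initial features of a scene."""
    diffs = [histogram_diff(histograms[i - 1], histograms[i]) for i in range(1, n)]
    mean = sum(diffs) / len(diffs)
    sd = (sum((d - mean) ** 2 for d in diffs) / len(diffs)) ** 0.5
    return mean, sd
```

A scene change point could then be declared, for example, where the inter-frame difference deviates from the running mean by more than a few standard deviations.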
- When the start of a news item is to be detected, for example, a template image of the shot that starts each news item (for example, the part where the announcer appears) is registered in advance, as shown in FIG. 7. The image corresponding to a scene change point is matched against the template image, and if the similarity is high, the scene change point is registered as a scene start position. For the similarity, the inter-frame pixel difference or the inter-frame color histogram difference, for example, can be used. Alternatively, if a feature amount of the template image (for example, a color histogram or a motion pattern of the template image) has been registered in advance, the feature amount extracted from the image corresponding to the scene change point is matched against the feature amount of the template image, and if the similarity is high, the scene change point is registered as the start position of a scene.
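- The template-matching branch can be sketched in the same style, comparing the histogram of the frame at a scene change point against a pre-registered template histogram (for example, the announcer shot). The histogram-intersection similarity and the 0.8 threshold below are illustrative assumptions, not values from the text.

```python
def is_scene_start(frame_hist, template_hist, threshold=0.8):
    """Register a scene change point as a scene start when the histogram
    intersection with the template shot is high enough.

    Histograms are dicts mapping "H", "S", "V" to equal-length bin lists.
    """
    overlap = sum(
        min(a, b)
        for channel in ("H", "S", "V")
        for a, b in zip(frame_hist[channel], template_hist[channel])
    )
    total = sum(sum(template_hist[c]) for c in ("H", "S", "V"))
    return overlap / total >= threshold
```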
- the scene division unit 3 determines the start position and the end position of the scene based on the scene start position information automatically detected by the scene change detection unit 39.
- the scene division unit 3 of the second embodiment can determine the start position and the end position of the scene based on an instruction from the user, as in the first embodiment.
- Furthermore, the scene division unit 3 outputs the scene section information metadata 12, describing the start and end positions of scenes, to the scene change detection unit 39, so that the scene change detection unit 39 can detect scene change points based on that scene section information.
- the scene description editing unit 5 can re-divide and integrate the scene automatically detected by the scene change detection unit 39 based on the scene section information metadata 12 from the scene division unit 3.
- The details of the scene description editing unit 5 are the same as in the first embodiment. Therefore, according to the metadata editing apparatus 100A of the second embodiment, scene change points can be detected automatically in addition to the effects obtained in the first embodiment.
- FIG. 8 is a block diagram showing a configuration of a metadata reproducing apparatus according to Embodiment 3 of the present invention.
- The metadata reproducing device 200 includes a metadata analysis unit 19, a structure display unit 20, a thumbnail image display unit 21, a user input unit 22, a search unit 23, a search result display unit 24, a summary creation unit 25, a summary structure display unit 26, and a content playback unit 27.
- the metadata analysis unit 19 analyzes the metadata 28 that describes the hierarchical scene structure of the content, information on thumbnails of each scene, and the feature amount of each scene.
- the structure display section 20 displays the scene structure obtained from the metadata analysis result.
- the thumbnail image display unit 21 displays thumbnail image information 30 obtained from the metadata analysis result.
- the user input unit 22 gives instructions such as search and reproduction.
- the search unit 23 performs a search based on a search instruction (search condition 31) from the user and the feature amount / text information 32 of the scene obtained from the metadata.
- the search result display section 24 displays the search results 33.
- the summary creation unit 25 creates a summary based on a summary creation instruction (summary creation condition 34) from the user.
- the summary structure display section 26 displays the structure of the summarized content.
- The content playback section 27 plays and displays the content to be played based on the summary information 35 and the content playback instructions 36.
- The metadata analysis unit 19 receives the metadata 28, which describes the hierarchical scene structure of the content, information on thumbnails of each scene, and the features of each scene, and analyzes it.
- Since the metadata 28 is generated by the metadata description unit 9 of the first and second embodiments in the MPEG-7 standard format, it is either a text file described in XML or a binary file encoded in the binary format.
- Accordingly, the metadata analysis unit 19 has the function of an XML parser for analyzing XML files and, if the metadata 28 is encoded in the binary format, a decoder function for decoding it.
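- For the XML case, the parser role of the metadata analysis unit can be sketched with Python's standard `xml.etree.ElementTree`. The element and attribute names below (`Content`, `VideoSegment`, `start`, `end`) are simplified stand-ins for illustration, not the actual MPEG-7 schema.

```python
import xml.etree.ElementTree as ET

# A toy stand-in for MPEG-7-style scene metadata (hypothetical schema).
SAMPLE = """
<Content title="News">
  <VideoSegment id="1" start="0" end="120" title="Headlines"/>
  <VideoSegment id="2" start="120" end="300" title="Sports"/>
</Content>
"""


def parse_scenes(xml_text):
    """Extract the scene list (id, title, start/end positions) from the XML."""
    root = ET.fromstring(xml_text)
    return [
        {"id": seg.get("id"), "title": seg.get("title"),
         "start": int(seg.get("start")), "end": int(seg.get("end"))}
        for seg in root.iter("VideoSegment")
    ]
```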
- The structure display unit 20 receives the analysis result of the metadata analysis unit 19 and displays the hierarchical scene structure 29 of the content; for example, as shown in FIG. 4, the scene structure of the content is displayed as a tree along with the title of each scene.
- the thumbnail image display section 21 receives the analysis result (thumbnail image information 30) of the metadata analysis section 19 and displays a list of thumbnail images of the content.
- the search unit 23 searches for a scene included in the content according to a search instruction from the user via the user input unit 22.
- the user input unit 22 inputs a search condition by presenting a keyword or a sample image.
- The search unit 23 performs the search by matching the search conditions presented by the user (keywords or features of a sample image) against the scene feature amounts and text information, such as scene titles, described in the metadata.
- the search result display unit 24 receives the search result 33 from the search unit 23 and displays the search result.
- a display method of the search result for example, a thumbnail image of a scene matching the search condition is displayed.
- the summary creation unit 25 creates a summary of the content based on a summary creation instruction from the user via the user input unit 22.
- The user input unit 22 inputs information such as the playback time of the summarized content and user preferences. For example, if the content is a news video, the user may enter preference information such as wanting to watch mainly the sports portion of the news, or wanting an hour of news summarized in 20 minutes.
- the summary creation unit 25 creates summary information 35 that matches summary conditions based on text information 32 such as the playback time of the scene described in the metadata and the title of the scene.
- the summary information 35 is, for example, a playlist of scenes included in the summarized content, and includes location information such as a URL of the content, and a start position and an end position of the scene to be reproduced in the content. It is a list that is listed.
- the target content is specified based on the location information of the content included in the summary information 35, and the scene to be played is obtained and played back based on the scene list included in the summary information.
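- A playlist of the kind described above can be sketched as a list of (content location, start, end) entries chosen to fit the requested playback time. The selection rule below (keep scenes in their original order while the time budget allows) is an assumption for illustration; the text itself does not fix a particular rule.

```python
def build_playlist(scenes, target_seconds):
    """Build summary information as a playlist of scenes.

    scenes: list of dicts with "url" (content location) and "start"/"end"
    positions in seconds. Scenes are kept in original order while the
    total duration fits the requested playback time.
    """
    playlist, total = [], 0
    for s in scenes:
        duration = s["end"] - s["start"]
        if total + duration <= target_seconds:
            playlist.append((s["url"], s["start"], s["end"]))
            total += duration
    return playlist
```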
- the summary information may be a hierarchical description of a summarized scene structure.
- FIG. 9 is a diagram showing an example of a summarized scene structure described hierarchically.
- FIG. 9(a) shows an example of the scene structure of the original content. An importance in the range of 0.0 to 1.0 is assigned to each scene; 1.0 means most important and 0.0 means least important. The importance is assumed to be calculated based on, for example, user preferences.
- When a summary is generated from FIG. 9(a) using only the scenes with the highest importance, the summarized scene structure is as shown in FIG. 9(b).
- Each scene has location information such as the URL of the content including the scene, and metadata such as position information (start position and end position) within the content of the scene.
- Information on the summarized scene structure 38 is passed to the summary structure display unit 26, and the summary structure display unit 26 displays the summarized scene structure as shown in FIG. 9 (b), for example. Display in tree format.
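- The importance-based summarization of FIG. 9 can be sketched as a recursive prune of the scene tree that keeps only subtrees whose importance reaches a threshold. The tree representation and the threshold rule are assumptions; the text only states that the scenes with the highest importance are kept.

```python
def summarize(node, threshold):
    """Return a pruned copy of a scene tree.

    node: {"title": str, "importance": float, "children": [...]}.
    Children below the importance threshold are dropped together with
    their subtrees; the root itself is always kept.
    """
    kept = [
        summarize(child, threshold)
        for child in node.get("children", [])
        if child["importance"] >= threshold
    ]
    return {"title": node["title"], "importance": node["importance"], "children": kept}
```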
- When the user, via the user input unit 22, selects a scene from the scene structure displayed on the structure display unit 20 or the summary structure display unit 26, or from the thumbnail image display unit 21 or the search result display unit 24, the content playback/display section 27 can play and display the scenes contained in the content. Therefore, according to the metadata reproducing apparatus 200 of the third embodiment, by using the metadata generated by the metadata editing apparatus described in the first and second embodiments, only the scenes the user wants to see can be collected and played back, and the desired scene can be searched for using the features described in the metadata.
- In the third embodiment, the content playback/display unit 27 is included in the metadata reproducing device; however, the content playback/display unit may be provided in another device. For example, operations and displays related to metadata playback, such as the display of scene structures and thumbnail images, may be performed on a mobile phone or portable information terminal, while the processing and display related to playback of the multimedia content are performed on a terminal (for example, a PC) connected to the mobile phone or portable information terminal via a network.
- Fourth Embodiment. In the fourth embodiment, a metadata distribution server (metadata distribution device) that distributes content metadata reconfigured scalably according to the terminal capability of client terminals, and a content distribution server that distributes the content, will be described.
- FIG. 10 is a block diagram showing a configuration of a content distribution system according to Embodiment 4 of the present invention.
- The content distribution system 300 includes a metadata distribution server 400, various client terminals 481 to 48n, and a content distribution server 500.
- The metadata distribution server 400 includes a metadata storage unit 41, a metadata analysis unit 42, a terminal capability determination unit 43, a metadata regeneration unit 44, and a metadata distribution unit 45.
- the metadata storage unit 41 stores, for example, metadata generated by the metadata editing apparatuses of the first and second embodiments.
- the metadata analysis unit 42 analyzes the metadata 49 describing the structure and characteristics of the content.
- the terminal capability determination unit 43 determines the terminal capability of the client terminal based on the information 51 on the performance of the client terminal.
- the metadata regenerating unit 44 reconstructs the content according to the terminal capability of the client terminal based on the metadata analysis result 50, and regenerates the metadata 52 describing the content.
- The metadata distribution unit 45 distributes the metadata 53 regenerated by the metadata regeneration unit 44 to the client terminals 481 to 48n.
- the metadata storage unit 41 may be provided outside the metadata distribution server 400 of the fourth embodiment. In that case, the metadata distribution server 400 inputs the metadata 49 from the metadata storage unit 41 via a network (not shown) or the like.
- the content distribution server 500 includes a content storage unit 46 and a content distribution unit 47.
- the content storage unit 46 stores the content 55.
- The content distribution unit 47 distributes content in response to content distribution requests 54 from the client terminals 481 to 48n.
- the content storage unit 46 may be provided outside the content distribution server 500. In that case, the content distribution server 500 inputs the content data 55 via a network (not shown).
- the metadata analysis unit 42 analyzes the metadata stored in the metadata storage unit 41.
- the operation of the metadata analysis unit 42 is the same as that of the metadata analysis unit 19 of the metadata reproduction device 200 of the third embodiment.
- The metadata analysis unit 42 obtains information on the structure and features of each piece of content by analyzing the metadata.
- FIG. 11 is a diagram illustrating the structure information of the content (an example of a news video) output from the metadata analysis unit of the metadata distribution server according to the fourth embodiment.
- In FIG. 11, the hierarchical scene structure of the content is displayed as a tree. Each node corresponds to a scene, and various scene information is associated with each node.
- Scene information includes scene titles, abstracts, time information of scene start and end positions, scene thumbnails, representative frames, thumbnail shots, representative shots, and visual features such as color and motion.
- FIG. 11 shows only the title of the scene among various types of scene information.
- the client terminal is assumed to be various information home appliances with different terminal capabilities. Terminal capabilities include communication speed, processing speed, image formats that can be played back and displayed, image resolution, and user input functions.
- The client terminal 481 is assumed to be a PC (personal computer) having sufficient communication speed, processing speed, display performance, and user input functions.
- the client terminal 482 is assumed to be a mobile phone, and the other client terminals are assumed to be PDAs.
- Information on each terminal performance is transmitted from each of the client terminals 481 to 48n.
- The terminal capability determination unit 43 analyzes the information 51 on terminal performance transmitted from each of the client terminals 481 to 48n, determines the distributable image format, the maximum image resolution, the content length, and so on, and outputs them to the metadata regeneration unit 44. For example, if the original content is high-resolution video content encoded in MPEG-2, the client terminal 481, which has sufficient performance, can reproduce the original content as it is.
- the client terminal 481 has a function capable of summarizing and retrieving images described in the third embodiment.
- On the other hand, the client terminal 482 is assumed to be able to reproduce only short video shots encoded in MPEG-4, and its maximum displayable resolution is small.
- When the distributable image format, maximum image resolution, content length, and the like for each client terminal are input from the terminal capability determination unit 43, the metadata regeneration unit 44 reconstructs the structure of the content accordingly.
- FIG. 12 is a diagram illustrating an example of the structure of the content after reconfiguration by the metadata regeneration unit of the content distribution system according to the fourth embodiment. As shown in FIG. 12, important scenes are extracted from the scenes of each news item, and the content is reconfigured to consist only of the representative shots or representative frames of those scenes.
- the metadata regeneration unit 44 regenerates metadata describing only the reconstructed scene structure and the position information of the representative shot or the representative frame of the scene, and sends the metadata to the metadata distribution unit 45. Output.
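- The capability-dependent reconfiguration can be sketched as follows. The capability field (`formats`) and the importance cut-off of 0.75 are illustrative assumptions; the text only states that a capable terminal receives the full structure while a constrained terminal receives representative shots or frames of important scenes.

```python
def regenerate_metadata(scenes, terminal):
    """Reconstruct scene metadata according to the judged terminal capability.

    scenes: list of dicts with "title", "importance", "start", "end",
    "representative_frame". terminal: dict with a "formats" list of
    playable codecs (hypothetical capability schema).
    """
    if "MPEG-2" in terminal["formats"]:
        # A capable terminal (e.g., a PC) gets the full scene structure.
        return scenes
    # A constrained terminal (e.g., a mobile phone) gets only the
    # representative frames of the important scenes.
    return [
        {"title": s["title"], "representative_frame": s["representative_frame"]}
        for s in scenes
        if s["importance"] >= 0.75
    ]
```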
- The metadata distribution unit 45 distributes the metadata 53 generated by the metadata regeneration unit 44 to the client terminals 481 to 48n.
- Each of the client terminals 481 to 48n analyzes the metadata 53 distributed from the metadata distribution unit 45 and acquires the scene structure information of the content.
- When the user of a client terminal 481 to 48n selects a scene to be reproduced, the position information of the selected scene is transmitted from that client terminal to the content distribution unit 47 of the content distribution server 500.
- The content distribution unit 47 of the content distribution server 500 acquires the position information of the scene transmitted from each client terminal 481 to 48n, obtains the corresponding content 55 from the content storage unit 46, and distributes it to the client terminals 481 to 48n. When the start and end positions of a scene are transmitted, the corresponding scene of the original content is distributed.
- From the client terminal 482, the location information (URI, etc.) of the representative shot of a scene is transmitted. If the representative shot has an image format that the client terminal 482 cannot play back, an image resolution it cannot display, or an image file size that is too large, the content distribution unit 47 converts the format, the resolution, or the file size to reduce the content before sending it.
- As described above, according to the metadata distribution server 400 of the fourth embodiment, it is possible to regenerate metadata according to the capabilities of the client terminals 481 to 48n and distribute it to each client terminal.
- In the fourth embodiment, the metadata distribution server 400 and the content distribution server 500 are shown as separate configurations, but the present invention is not limited to this configuration. A content distribution server may be provided in the metadata distribution server, a metadata distribution server may be provided in the content distribution server, or the metadata distribution server and the content distribution server may be provided in the same server.
- In that case, the content distribution unit 47 can easily obtain the capabilities of the client terminals 481 to 48n from the terminal capability determination unit 43, so that the content can be reconfigured, for example by format conversion, according to the capability of each client terminal and distributed to the client terminals 481 to 48n.
- the metadata stored in the metadata storage unit 41 has been described as being generated by the metadata editing apparatus of the first and second embodiments, for example.
- the present invention is not limited to this.
- the metadata generated by devices other than the metadata editing apparatuses of the first and second embodiments may be stored.
- Fifth Embodiment. In the fifth embodiment, another example of the metadata distribution server described in the fourth embodiment will be described.
- In the fourth embodiment, the metadata is regenerated based on the terminal information transmitted from the client terminal. In the fifth embodiment, a metadata distribution server that performs metadata analysis and regeneration using metadata optimization hint information, which is hint information for metadata regeneration, will be described.
- FIG. 13 is a block diagram showing a configuration of a metadata distribution server according to Embodiment 5 of the present invention.
- The metadata distribution server 400A includes a hint information analysis unit 61, a metadata analysis/regeneration unit 63, and a metadata distribution unit 45.
- the hint information analysis unit 61 analyzes the metadata optimization hint information 60 and outputs the result.
- The metadata analysis/regeneration unit 63 analyzes the metadata 49, which describes the structure and features of the content, based on the analyzed metadata optimization hint information 62 and on conditions 65 relating to metadata regeneration, such as information on the performance of the client terminal and user preferences, and outputs the reconstructed metadata 64.
- the metadata distribution unit 45 distributes the metadata 53 to the client terminal.
- The metadata storage unit 41 (see FIG. 10) stores the metadata 49, which describes the structure and features of the content, and the metadata optimization hint information 60, which is hint information for regenerating the metadata 49. The metadata optimization hint information 60 describes what kinds of information are included in the metadata 49, how much of that information is included, and the complexity of the metadata 49.
- the metadata optimization hint information 60 will be described in detail with reference to video content having the structure shown in FIG. 14 as an example.
- FIG. 14 shows the temporal hierarchical structure between the scenes of the video content (root: a soccer game program) in a tree-like structure; the root is divided into scenes such as Scene1-1, Scene1-2, ..., Scene1-n.
- The metadata 49 describes the temporal hierarchical structure of such content, that is, the temporal relationships between scenes and the start time and length of each scene.
- the color and motion features are described only in the level 4 video segment.
- Temporal hierarchical relationships between scenes can be expressed by describing video segments recursively.
- the description “time division” describes that one video segment is composed of a plurality of time-divided video segments.
- In MPEG-7, the spatial hierarchical structure of content can be described in the same way.
- a description “space division” indicating that one segment is composed of a plurality of spatially divided segments is used.
- the metadata optimization hint information 60 for regenerating the metadata 49 describes the type and content of information (descriptor) included in the metadata 49.
- For the metadata described above, the metadata optimization hint information 60 includes the descriptor ("time division") that expresses the temporal hierarchical structure of the content, descriptors that express the color histogram and the complexity of motion, and descriptors that express the title, abstract, genre, and importance.
- the maximum depth of the hierarchical structure of the video segments is 4 (level 1 to level 4).
- The importance takes five discrete values ({0.0, 0.25, 0.5, 0.75, 1.0}).
- As viewpoints of the importance, the importance when viewed from the viewpoint of "Team A" and the importance when viewed from the viewpoint of "Team B" are described.
- FIG. 16 shows a format example of the metadata optimization hint information 60.
- the metadata optimization hint information 60 shown in FIG. 16 includes metadata file information and metadata component information.
- the metadata file information includes the location of the metadata file, the size of the metadata file, the metadata file format (indicating the file format such as XML format and binary format), and the syntax file information (specifying the metadata syntax).
- It also includes the number of appearing elements, which indicates the number of elements appearing in the metadata; this is information for predicting the resources required to process the metadata, such as the memory size required to store and analyze the metadata and the processing system (software, etc.) required to analyze it.
- syntax file information describes the location of syntax files such as DTD files and schema files.
- Metadata component information is information that describes the types of descriptors that constitute metadata and their contents.
- The metadata component information includes the name of each descriptor included in the metadata, the frequency (number of times) with which the descriptor appears in the metadata, a description of whether the descriptor includes all the information it may grammatically include (completeness of the description), and, if the descriptor is described recursively, the maximum depth of the temporal or spatial hierarchy that the descriptor has.
- For example, "video segment" is a recursively described descriptor and here has a structure of up to four levels, so the maximum depth is 4.
- Furthermore, the appearance position (hierarchy level) at which a descriptor appears is also hint information. For example, if "importance" is a descriptor included in "video segment" but only in video segments of level 3 or higher, that is, if it is not included in the video segments of level 4, then the appearance position of "importance" is up to level 3. In this way, the appearance position can be specified as a hierarchy level; if IDs are assigned to the video segments that include the descriptor, the appearance position can also be specified as a list of those IDs.
- For a descriptor having values, the type of the descriptor and the range of values the descriptor can take are also hint information. For example, if the importance is expressed, from the viewpoints of "Team A" and "Team B", by the five discrete values {0.0, 0.25, 0.5, 0.75, 1.0}, the possible values of the importance are a list of floating-point type, {0.0, 0.25, 0.5, 0.75, 1.0}. The above is described repeatedly for each descriptor that is a component of the metadata.
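- The hint items enumerated above can be sketched as a plain data structure; the field names below are paraphrases of the items listed in the text (file location, size, format, syntax file, number of appearing elements, descriptor name, occurrences, completeness, maximum depth, appearance IDs), not the normative syntax of FIG. 16.

```python
from dataclasses import dataclass, field


@dataclass
class ComponentHint:
    name: str                  # descriptor name, e.g. "video segment"
    occurrences: int           # number of times it appears in the metadata
    complete: bool             # whether all grammatically possible fields appear
    max_depth: int = 0         # hierarchy depth if described recursively
    appearance_ids: list = field(default_factory=list)  # optional segment IDs


@dataclass
class MetadataHint:
    file_location: str
    file_size: int
    file_format: str           # "XML" or "binary"
    syntax_location: str       # location of the DTD/schema file
    element_count: int         # number of appearing elements
    components: list = field(default_factory=list)


hint = MetadataHint(
    file_location="metadata.xml", file_size=120_000, file_format="XML",
    syntax_location="mpeg7.xsd", element_count=420,
    components=[ComponentHint("video segment", 20, True, max_depth=4),
                ComponentHint("title", 5, True, appearance_ids=[1, 3, 5])],
)
```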
- FIG. 17 shows an example of metadata optimization hint information described in accordance with the format of FIG. 16. The example of the metadata optimization hint information 60 shown in FIG. 17 includes metadata file information and metadata component information for "video segment" and "title". Next, a method of regenerating metadata using the metadata optimization hint information 60 will be described with reference to FIG. 13.
- the hint information analysis unit 61 analyzes the metadata optimization hint information 60 described in a prescribed format.
- The metadata analysis/regeneration unit 63 analyzes the metadata 49 using the analyzed metadata optimization hint information 62 output from the hint information analysis unit 61, and outputs the metadata 64 regenerated based on the conditions 65 relating to metadata regeneration.
- FIG. 18 shows an example of a metadata analysis method performed by the metadata analysis / regeneration unit 63 using the analyzed metadata optimization hint information 62.
- the metadata analysis / regeneration unit 63 specifies metadata required for regeneration from the conditions 65 for metadata regeneration (step S1).
- Here, "importance" and "video segment" are assumed to be the descriptors necessary for regeneration.
- Next, it is determined from the metadata optimization hint information 62 whether the metadata 49 includes the descriptor specified in step S1 (hereinafter, the descriptor "importance" is taken as the example) (step S2). If the metadata includes the "importance" descriptor, the metadata is analyzed (step S3).
- Since the hint information specifies that the appearance position of the "importance" descriptor is up to level 3, the analysis of the "video segment" descriptions proceeds down to level 3 (step S4), and when the analysis of the video segments up to level 3 is completed (step S5), the analysis process ends without analyzing level 4 or lower (step S6). The processing from step S1 is repeated to analyze other metadata 49 as necessary. Also, if the number of occurrences of the "importance" descriptor is specified as 20 in the metadata optimization hint information 62, the analysis of the metadata ends as soon as the analysis of 20 "importance" descriptors is completed (steps S5 to S6).
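- Steps S1 to S6 can be sketched as a recursive walk over nested video segments that exploits the two early-exit hints just described: the maximum appearance depth and the number of occurrences. The nested-dictionary metadata model is an assumption for illustration.

```python
def collect_importance(segment, max_depth, max_count, level=1, found=None):
    """Walk recursively nested video segments, but stop descending below the
    appearance depth given in the hint information, and stop entirely once
    the hinted number of occurrences has been collected."""
    if found is None:
        found = []
    if level > max_depth or len(found) >= max_count:
        return found  # the hint says nothing more can appear: skip the rest
    if "importance" in segment:
        found.append(segment["importance"])
    for child in segment.get("children", []):
        collect_importance(child, max_depth, max_count, level + 1, found)
    return found
```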
- FIG. 19 shows another example of how to analyze metadata using the analyzed metadata optimization hint information 62.
- The metadata analysis/regeneration unit 63 determines, for each video segment, whether or not its ID matches one of the appearance-position IDs described in the metadata optimization hint information 62 (step S13).
- If the IDs do not match, the video segment does not include the "title" descriptor, so the analysis of the description of that video segment is skipped (step S16).
- If the IDs match, the description of the video segment is analyzed to obtain the "title" descriptor (step S15).
- the analysis process ends (step S18).
- the processing from step S11 is repeated to analyze another metadata as needed.
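- The ID-matching variant (steps S13 to S16) can be sketched similarly: segments whose IDs are absent from the hinted appearance list are skipped without analyzing their descriptions. The segment model and field names are assumptions.

```python
def collect_titles(segments, appearance_ids):
    """Analyze only the video segments whose ID appears in the hint
    information; all other segment descriptions are skipped outright."""
    titles = []
    for seg in segments:
        if seg["id"] not in appearance_ids:
            continue  # step S16: no "title" descriptor here, skip the description
        titles.append(seg["description"]["title"])  # step S15
    return titles
```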
- the metadata 64 reconstructed with the descriptors extracted through the above analysis processing is output.
- the metadata distribution unit 45 distributes the reconstructed metadata 64 to various client terminals.
- Note that metadata optimization hint information corresponding to the regenerated metadata may also be regenerated.
- Conventionally, all the descriptors included in the metadata had to be analyzed in order to regenerate the metadata. In the fifth embodiment, however, the descriptors included in the metadata 49 are analyzed using the metadata optimization hint information 60, which describes the list of descriptors in the metadata 49 and their appearance positions, numbers of occurrences, and so on, so much of the analysis of the metadata 49 itself for metadata regeneration can be omitted.
- Sixth Embodiment. In the fifth embodiment described above, a metadata distribution server that reduces the processing cost associated with metadata analysis and regeneration by using metadata optimization hint information for metadata regeneration was described. In the sixth embodiment, a metadata search server that uses metadata optimization hint information to reduce the processing associated with metadata search will be described.
- FIG. 20 is a block diagram illustrating a configuration of a metadata search server according to Embodiment 6 of the present invention.
- the metadata search server 600 includes a hint information analysis unit 61, a metadata analysis unit 71, and a search unit 73.
- the hint information analysis unit 61 is the same as that in the fifth embodiment, and thus the description is omitted.
- The metadata analysis unit 71 uses the analyzed metadata optimization hint information 62 and the search conditions 70 to efficiently analyze the huge amount of metadata 49 describing the structure and features of the content.
- the search unit 73 searches for content that matches the search condition using the analysis result 72 of the metadata.
- FIG. 21 shows the operation of the metadata analysis unit of the metadata search server according to the sixth embodiment.
- the metadata analysis unit 71 analyzes one or more pieces of metadata using the metadata optimization hint information 62 corresponding to each piece of metadata.
- analyzing the metadata means extracting from it the feature descriptions required for retrieval. For example, when the color feature of a video segment is given as a search condition and video segments with similar features are searched for, it is necessary to extract the descriptions of video segments that have a feature description related to color.
- the metadata analysis unit 71 analyzes the search condition 70 and specifies a descriptor effective for the search (step S21).
- as search conditions, there are cases where a feature amount according to a description specified in MPEG-7 is given, and cases where an image or a keyword is given.
- when the search condition is given as a feature value according to an MPEG-7 description (for example, color arrangement information), the corresponding descriptor (color arrangement information) is effective for the search; when the search condition is given as a keyword, text-format descriptors (title, abstract, annotation, etc.) are effective.
- next, it is determined whether or not the selected descriptor is included in the metadata 49 (step S22). If the descriptor used for the search is not included in the metadata 49, the analysis of that metadata 49 is terminated (step S24), and other metadata 49 is analyzed as necessary. If the selected descriptor is included in the metadata 49, the metadata is analyzed (step S23). As in the fifth embodiment, the metadata analysis shown in FIGS. 18 and 19 is performed efficiently using the metadata optimization hint information 62 (steps S25 to S26). Through the above processing, the metadata analysis unit 71 extracts the feature descriptions necessary for the search.
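The steps above can be sketched as a hypothetical Python function (names and data shapes are assumptions, not the patent's structures): the hint information is consulted first, and the metadata itself is only read when the descriptor needed for the search is known to be present.

```python
def analyze_for_search(metadata, hint, needed):
    """Steps S21-S26 in miniature: return the descriptors named `needed`,
    or None when the hint shows they are absent (so the metadata itself
    is never parsed)."""
    if needed not in hint:                       # S22 -> S24: descriptor absent, stop
        return None
    return [metadata[p] for p in hint[needed]]   # S23, S25-S26: hint-guided analysis

metadata = [{"name": "Title"}, {"name": "ColorHistogram", "bins": [5, 2]}]
hint = {"Title": [0], "ColorHistogram": [1]}

print(analyze_for_search(metadata, hint, "MotionComplexity"))  # → None: skip this metadata
print(analyze_for_search(metadata, hint, "ColorHistogram"))    # the needed feature description
```

The early `None` return is the whole point: metadata that cannot satisfy the search condition is rejected from the hint alone, at negligible cost.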
- the search unit 73 searches for content that matches the search conditions by using the metadata analysis result (feature description required for search) 72 output from the metadata analysis unit 71.
- for example, descriptions of video segments having a color feature description ("color histogram") are output from the metadata analysis unit 71, so the search unit 73 compares them with the color feature amount (histogram) given as the search condition and outputs the information of the matching video segments (for example, their time information) as the search result 74.
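A hedged sketch of this matching step, with an assumed L1 histogram distance (the patent does not fix a similarity measure) and hypothetical segment records:

```python
def search_by_histogram(query, segments):
    """Pick the segment description whose color histogram is closest
    (smallest L1 distance) to the histogram given as the search condition."""
    def l1(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))
    return min(segments, key=lambda s: l1(s["histogram"], query))

segments = [                                  # feature descriptions from the analysis unit
    {"time": (0.0, 12.5),  "histogram": [8, 1, 1]},
    {"time": (12.5, 30.0), "histogram": [2, 6, 2]},
]
best = search_by_histogram([1, 7, 2], segments)
print(best["time"])   # → (12.5, 30.0): the segment's time information is the search result
```

Returning the segment's time information (rather than the pixels themselves) matches the patent's point that the search result 74 identifies where in the content the matching scene lies.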
- as described above, according to the sixth embodiment, since the metadata 49 is analyzed using the metadata optimization hint information 60, the analysis of the metadata 49 itself can be omitted, and the analysis of descriptors unnecessary for the search can also be omitted, so the processing cost (processing amount, memory usage, etc.) associated with the metadata search can be reduced.
- next, a client terminal according to Embodiment 7 of the present invention will be described with reference to the drawings. FIG. 22 is a block diagram showing the configuration of the client terminal according to Embodiment 7 of the present invention.
- the client terminal 48A is provided with a hint information analysis unit 80 and a metadata regeneration condition setting unit 82. Note that, among the functions provided in the client terminal 48A, FIG. 22 shows only the part relating to the means for setting conditions for metadata regeneration using the metadata optimization hint information 60. Next, the operation of the client terminal according to the seventh embodiment will be described with reference to the drawings.
- the hint information analysis unit 80 analyzes the metadata optimization hint information 60 described in a prescribed format. This hint information analysis unit 80 is the same as that of the fifth embodiment, and thus a detailed description is omitted.
- the metadata regeneration condition setting unit 82 sets metadata regeneration conditions 83 based on the analysis result 81 output from the hint information analysis unit 80.
- the condition setting includes, for example, selecting descriptors unnecessary for the client terminal 48A from among the types of descriptors included in the metadata optimization hint information 60. If the client terminal 48A does not have a search function that uses features, descriptors representing features such as a color histogram or motion complexity are not required.
- in another example, the depth of the hierarchical structure that the client terminal can process is set based on the maximum depth of the hierarchical structure described in the metadata optimization hint information 60. In yet another example, the user's viewpoint of interest and the importance threshold for scenes to be selected are set based on the information about the possible values of importance described in the metadata optimization hint information 60. For example, when the importance takes five discrete values ({0.0, 0.25, 0.5, 0.75, 1.0}) from each of the viewpoints "Team A" and "Team B", a threshold such as 0.5 can be set so that only scenes at or above that importance are selected.
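The three kinds of condition setting just described can be sketched together in Python; every field name and value shape here is an assumption for illustration, not the patent's actual hint format:

```python
hint = {                             # assumed shape of hint information 60
    "descriptors": ["Title", "ColorHistogram", "MotionComplexity"],
    "max_depth": 4,
    "importance_values": [0.0, 0.25, 0.5, 0.75, 1.0],
    "viewpoints": ["Team A", "Team B"],
}

def set_regeneration_conditions(hint, terminal_depth, has_feature_search):
    """Derive metadata regeneration conditions 83 from the analyzed hint."""
    feature_descriptors = {"ColorHistogram", "MotionComplexity"}
    return {
        # drop feature descriptors when the terminal cannot search by feature
        "drop": [] if has_feature_search
                else [d for d in hint["descriptors"] if d in feature_descriptors],
        # never request more depth than the original metadata actually has
        "depth": min(hint["max_depth"], terminal_depth),
        # threshold chosen from the importance values the hint says can occur
        "importance_threshold": 0.5,
        "viewpoint": "Team A",
    }

cond = set_regeneration_conditions(hint, terminal_depth=2, has_feature_search=False)
print(cond["depth"], cond["drop"])   # → 2 ['ColorHistogram', 'MotionComplexity']
```

The point of reading the threshold and viewpoint out of the hint is that the terminal can only ask for conditions the metadata can actually satisfy.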
- the condition for metadata regeneration 83 set in the metadata regeneration condition setting unit 82 is transmitted to the metadata distribution server.
- the metadata distribution server reconstructs the metadata based on the conditions for metadata regeneration and on the terminal performance of the client terminal. For example, if the maximum depth of the hierarchical structure of the original metadata is 4 and the depth of the hierarchical structure that the client terminal can process is set to 2 in the metadata regeneration conditions, the metadata structure is reconstructed so that its maximum depth is 2.
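The depth restructuring can be illustrated with a small recursive prune over a hypothetical scene tree (the dict shape is an assumption; real metadata would be an MPEG-7 hierarchy):

```python
def prune(node, max_depth, depth=1):
    """Return a copy of the scene hierarchy with levels beyond max_depth removed."""
    out = {"title": node["title"]}
    if depth < max_depth and node.get("children"):
        out["children"] = [prune(c, max_depth, depth + 1) for c in node["children"]]
    return out

original = {"title": "match", "children": [            # depth 1
    {"title": "1st half", "children": [                # depth 2
        {"title": "goal", "children": [                # depth 3
            {"title": "close-up"}]}]}]}                # depth 4

slim = prune(original, max_depth=2)
print(slim)   # → {'title': 'match', 'children': [{'title': '1st half'}]}
```

After pruning, the terminal that declared a processable depth of 2 receives a tree it can fully handle, and the depths 3 and 4 are simply absent rather than silently truncated at parse time.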
- Metadata regeneration can be efficiently performed using metadata optimization hint information as in the fifth embodiment.
- since conditions for metadata regeneration can be set using the metadata optimization hint information 60, metadata appropriate to the client terminal or the application can be generated.
- Eighth Embodiment. In the fifth and sixth embodiments described above, servers that regenerate metadata using the metadata optimization hint information and distribute the regenerated metadata were described. In the eighth embodiment, a content distribution server that uses the metadata optimization hint information to reconstruct and distribute the content itself will be described.
- FIG. 23 is a block diagram illustrating a configuration of a content distribution server according to Embodiment 8 of the present invention.
- the content distribution server 500A includes a hint information analysis unit 61, a metadata analysis unit 86, and a content reconstruction/distribution unit 88.
- the operation of the content distribution server according to the eighth embodiment will be described with reference to the drawings.
- the operation of the hint information analysis unit 61 is the same as that of the fifth embodiment, and a description thereof will be omitted.
- the metadata analysis unit 86 analyzes the metadata 49 using the analyzed metadata optimization hint information 62 output from the hint information analysis unit 61, and obtains information about the client terminal or Extract descriptions that meet the conditions 85 for content reconstruction such as user preferences.
- the analysis method using the hint information is the same as in the fifth embodiment, except that the extracted description is used to reconstruct the content rather than to regenerate the metadata.
- the description extracted by the metadata analysis unit 86, that is, the analyzed metadata 87, is output to the content reconstruction/distribution unit 88.
- the content reconstruction/distribution unit 88 reconstructs the content 89 based on the description extracted by the metadata analysis unit 86.
- for example, only video segments having an importance of 0.5 or more are extracted from the metadata 49, and content 90 composed only of the scenes corresponding to the extracted video segments is reconstructed.
- the description of an extracted video segment includes the location of the corresponding content and the position (time information) of that video segment within the content, so it is possible to cut out the corresponding scenes from the content, reconstruct them into a single piece of content 90, and distribute it; alternatively, the corresponding scenes can be cut out of the content and distributed sequentially.
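The selection-and-cutting step above can be sketched as follows, with assumed segment records carrying an importance and start/end time information (the values are illustrative only):

```python
segments = [                         # stand-in for analyzed metadata 87
    {"importance": 0.75, "start": 10.0, "end": 25.0},
    {"importance": 0.25, "start": 25.0, "end": 40.0},
    {"importance": 0.5,  "start": 40.0, "end": 55.0},
]

def scenes_to_cut(segments, threshold=0.5):
    """Select segments at or above the importance threshold and return the
    (start, end) time ranges to cut from the content, in playback order."""
    return [(s["start"], s["end"]) for s in segments if s["importance"] >= threshold]

cuts = scenes_to_cut(segments)
print(cuts)   # → [(10.0, 25.0), (40.0, 55.0)]
```

The resulting ranges support both delivery modes the text mentions: splice them into one piece of content 90 before sending, or stream each cut scene to the terminal as it is extracted.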
- as described above, according to the eighth embodiment, since the metadata is analyzed using the metadata optimization hint information 60, in which the list of descriptors included in the metadata 49, their appearance positions, their numbers of appearances, and the like are described, the analysis of the metadata 49 itself can be omitted. In addition, the analysis of descriptors that do not match the reconstruction conditions can be omitted based on their appearance positions and numbers of appearances, so the processing cost (processing amount, memory usage, etc.) associated with metadata analysis and content reconstruction when regenerating and delivering content suited to client terminals and user preferences can be reduced.
- Industrial Applicability
- since the present invention divides multimedia content including video and audio into a plurality of scenes, edits the divided scenes, and generates metadata describing the hierarchical structure of the multimedia content, it is possible to generate metadata describing the hierarchical structure of multimedia content including video data and the like.
Priority Applications (12)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA2482431A CA2482431C (en) | 2002-04-12 | 2003-03-20 | An apparatus to edit, reproduce, deliver, search and re-generate condition settings for metadata |
JP2003585438A JPWO2003088665A1 (ja) | 2002-04-12 | 2003-03-20 | メタデータ編集装置、メタデータ再生装置、メタデータ配信装置、メタデータ検索装置、メタデータ再生成条件設定装置、及びメタデータ配信方法 |
US10/510,548 US7826709B2 (en) | 2002-04-12 | 2003-03-20 | Metadata editing apparatus, metadata reproduction apparatus, metadata delivery apparatus, metadata search apparatus, metadata re-generation condition setting apparatus, metadata delivery method and hint information description method |
AU2003221185A AU2003221185A1 (en) | 2002-04-12 | 2003-03-20 | Meta data edition device, meta data reproduction device, meta data distribution device, meta data search device, meta data reproduction condition setting device, and meta data distribution method |
KR1020047016204A KR100912984B1 (ko) | 2002-04-12 | 2003-03-20 | 메타데이터 편집 장치, 메타데이터 재생 장치, 메타데이터 배신 장치, 메타데이터 검색 장치, 메타데이터 재생성 조건 설정 장치, 콘텐츠 배신 장치, 메타데이터 배신 방법, 메타데이터 재생성 장치, 메타데이터 재생성 방법 |
EP03712804A EP1496701A4 (en) | 2002-04-12 | 2003-03-20 | METADATA EDITING DEVICE, METADATA REPRODUCTION DEVICE, METADATA DISTRIBUTION APPARATUS, METADA SEARCHING DEVICE, METADATA REPRODUCTION STATUS DISPLAYING DEVICE AND METADATA DISTRIBUTION METHOD |
US11/980,544 US20080065697A1 (en) | 2002-04-12 | 2007-10-31 | Metadata editing apparatus, metadata reproduction apparatus, metadata delivery apparatus, metadata search apparatus, metadata re-generation condition setting apparatus, metadata delivery method and hint information description method |
US11/980,624 US20080071837A1 (en) | 2002-04-12 | 2007-10-31 | Metadata editing apparatus, metadata reproduction apparatus, metadata delivery apparatus, metadata search apparatus, metadata re-generation condition setting apparatus, metadata delivery method and hint information description method |
US11/980,514 US20080075431A1 (en) | 2002-04-12 | 2007-10-31 | Metadata editing apparatus, metadata reproduction apparatus, metadata delivery apparatus, metadata search apparatus, metadata re-generation condition setting apparatus, metadata delivery method and hint information description method |
US11/980,523 US20080071836A1 (en) | 2002-04-12 | 2007-10-31 | Metadata editing apparatus, metadata reproduction apparatus, metadata delivery apparatus, metadata search apparatus, metadata re-generation condition setting apparatus, metadata delivery method and hint information description method |
US11/980,648 US8811800B2 (en) | 2002-04-12 | 2007-10-31 | Metadata editing apparatus, metadata reproduction apparatus, metadata delivery apparatus, metadata search apparatus, metadata re-generation condition setting apparatus, metadata delivery method and hint information description method |
US12/555,510 US20100005070A1 (en) | 2002-04-12 | 2009-09-08 | Metadata editing apparatus, metadata reproduction apparatus, metadata delivery apparatus, metadata search apparatus, metadata re-generation condition setting apparatus, and metadata delivery method and hint information description method |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2002-110259 | 2002-04-12 | ||
JP2002110259 | 2002-04-12 | ||
JP2002-178169 | 2002-06-19 | ||
JP2002178169 | 2002-06-19 |
Related Child Applications (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10510548 A-371-Of-International | 2003-03-20 | ||
US11/980,523 Division US20080071836A1 (en) | 2002-04-12 | 2007-10-31 | Metadata editing apparatus, metadata reproduction apparatus, metadata delivery apparatus, metadata search apparatus, metadata re-generation condition setting apparatus, metadata delivery method and hint information description method |
US11/980,514 Division US20080075431A1 (en) | 2002-04-12 | 2007-10-31 | Metadata editing apparatus, metadata reproduction apparatus, metadata delivery apparatus, metadata search apparatus, metadata re-generation condition setting apparatus, metadata delivery method and hint information description method |
US11/980,648 Division US8811800B2 (en) | 2002-04-12 | 2007-10-31 | Metadata editing apparatus, metadata reproduction apparatus, metadata delivery apparatus, metadata search apparatus, metadata re-generation condition setting apparatus, metadata delivery method and hint information description method |
US11/980,624 Division US20080071837A1 (en) | 2002-04-12 | 2007-10-31 | Metadata editing apparatus, metadata reproduction apparatus, metadata delivery apparatus, metadata search apparatus, metadata re-generation condition setting apparatus, metadata delivery method and hint information description method |
US11/980,544 Division US20080065697A1 (en) | 2002-04-12 | 2007-10-31 | Metadata editing apparatus, metadata reproduction apparatus, metadata delivery apparatus, metadata search apparatus, metadata re-generation condition setting apparatus, metadata delivery method and hint information description method |
US12/555,510 Division US20100005070A1 (en) | 2002-04-12 | 2009-09-08 | Metadata editing apparatus, metadata reproduction apparatus, metadata delivery apparatus, metadata search apparatus, metadata re-generation condition setting apparatus, and metadata delivery method and hint information description method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2003088665A1 true WO2003088665A1 (fr) | 2003-10-23 |
Family
ID=29253534
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2003/003450 WO2003088665A1 (fr) | 2002-04-12 | 2003-03-20 | Dispositif d'edition de metadonnees, dispositif de reproduction de metadonnees, dispositif de distribution de metadonnees, dispositif de recherche de metadonnees, dispositif d'etablissement de conditions de reproduction de metadonnees, et procede de distribution de metadonnees |
Country Status (10)
Country | Link |
---|---|
US (7) | US7826709B2 (ja) |
EP (7) | EP1496701A4 (ja) |
JP (5) | JPWO2003088665A1 (ja) |
KR (4) | KR100918725B1 (ja) |
CN (1) | CN100367794C (ja) |
AU (1) | AU2003221185A1 (ja) |
CA (2) | CA2482431C (ja) |
SG (1) | SG152905A1 (ja) |
TW (1) | TWI231140B (ja) |
WO (1) | WO2003088665A1 (ja) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005192196A (ja) * | 2003-11-12 | 2005-07-14 | Sony Internatl Europ Gmbh | ビデオ信号のサブセットの定義方法及び自動番組要約装置 |
JP2006066015A (ja) * | 2004-08-30 | 2006-03-09 | Sony Corp | 画像情報記録装置および画像情報表示装置 |
JP2007041861A (ja) * | 2005-08-03 | 2007-02-15 | Sharp Corp | コンテンツ編集装置、コンピュータ読み取り可能なプログラム及びそれを記録した記録媒体 |
JP2007527142A (ja) * | 2003-07-08 | 2007-09-20 | 松下電器産業株式会社 | コンテンツ蓄積システム、ホームサーバ装置、情報提供装置、集積回路、及びプログラム |
JP2008099012A (ja) * | 2006-10-12 | 2008-04-24 | Mitsubishi Electric Corp | コンテンツ再生システム及びコンテンツ蓄積システム |
JP2008526071A (ja) * | 2004-12-24 | 2008-07-17 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 番組検索情報を編集する方法および装置 |
JP2008244656A (ja) * | 2007-03-26 | 2008-10-09 | Ntt Docomo Inc | 遠隔視聴システム及び遠隔視聴方法 |
WO2008136466A1 (ja) * | 2007-05-01 | 2008-11-13 | Dep Co., Ltd. | 動画編集装置 |
JP2009152927A (ja) * | 2007-12-21 | 2009-07-09 | Sony Corp | コンテンツの再生方法および再生システム |
JP2009171480A (ja) * | 2008-01-21 | 2009-07-30 | Hitachi Ltd | 映像記録再生装置及び映像再生装置 |
JPWO2008041629A1 (ja) * | 2006-09-29 | 2010-02-04 | ソニー株式会社 | 再生装置および方法、情報生成装置および方法、データ格納媒体、データ構造、プログラム格納媒体、並びにプログラム |
JP4978894B2 (ja) * | 2005-01-25 | 2012-07-18 | 日本電気株式会社 | 構造化文書検索装置、構造化文書検索方法および構造化文書検索プログラム |
JP2013051707A (ja) * | 2007-04-17 | 2013-03-14 | Thomson Licensing | データストリームにおけるビデオデータ及び関連するメタデータを送信する方法 |
CN103309933A (zh) * | 2005-07-19 | 2013-09-18 | 苹果公司 | 用于媒体数据传输的方法和设备 |
JP2014197879A (ja) * | 2007-01-05 | 2014-10-16 | ソニック アイピー, インコーポレイテッド | プログレッシブ再生を含む映像分配システム |
Families Citing this family (131)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050162515A1 (en) * | 2000-10-24 | 2005-07-28 | Objectvideo, Inc. | Video surveillance system |
US7116716B2 (en) * | 2002-11-01 | 2006-10-03 | Microsoft Corporation | Systems and methods for generating a motion attention model |
TWI310545B (en) * | 2003-10-04 | 2009-06-01 | Samsung Electronics Co Ltd | Storage medium storing search information and reproducing apparatus |
KR20070007788A (ko) * | 2004-01-30 | 2007-01-16 | 마츠시타 덴끼 산교 가부시키가이샤 | 콘텐츠 재생장치 |
CA2568060A1 (en) * | 2004-05-25 | 2005-12-08 | Samsung Electronics Co., Ltd. | Method of reproducing multimedia data using musicphotovideo profiles and reproducing apparatus using the method |
US20050289593A1 (en) * | 2004-05-26 | 2005-12-29 | Skipjam Corp. | Method and system for displaying and selecting content of an electronic program guide |
US8953908B2 (en) * | 2004-06-22 | 2015-02-10 | Digimarc Corporation | Metadata management and generation using perceptual features |
US8156123B2 (en) * | 2004-06-25 | 2012-04-10 | Apple Inc. | Method and apparatus for processing metadata |
US9053754B2 (en) * | 2004-07-28 | 2015-06-09 | Microsoft Technology Licensing, Llc | Thumbnail generation and presentation for recorded TV programs |
KR100619064B1 (ko) | 2004-07-30 | 2006-08-31 | 삼성전자주식회사 | 메타 데이터를 포함하는 저장 매체, 그 재생 장치 및 방법 |
KR100565080B1 (ko) * | 2004-09-13 | 2006-03-30 | 삼성전자주식회사 | 대표 타이틀 정보가 부가된 메타 데이터를 포함한 av데이터를 기록한 정보저장매체, 그 재생장치 및 메타데이터 검색방법 |
KR100602388B1 (ko) * | 2004-11-17 | 2006-07-20 | 주식회사 픽스트리 | 엠펙-21 멀티미디어 프레임워크에서의 리소스 참조 방법 |
KR20060065476A (ko) | 2004-12-10 | 2006-06-14 | 엘지전자 주식회사 | 기록매체, 기록매체 내의 콘텐츠 서치방법 및 기록매체재생방법과 재생장치 |
JP4349277B2 (ja) * | 2004-12-24 | 2009-10-21 | 株式会社日立製作所 | 動画再生装置 |
US8360884B2 (en) | 2005-01-07 | 2013-01-29 | Electronics And Telecommunications Research Institute | Apparatus and method for providing adaptive broadcast service using game metadata |
FR2883441A1 (fr) * | 2005-03-17 | 2006-09-22 | Thomson Licensing Sa | Procede de selection de parties d'une emission audiovisuelle et dispositif mettant en oeuvre le procede |
TWI309389B (en) * | 2005-05-06 | 2009-05-01 | Sunplus Technology Co Ltd | Digital audio-video information reproducing apparatus and reproducing method thereof |
KR100654455B1 (ko) | 2005-05-26 | 2006-12-06 | 삼성전자주식회사 | 확장형 자막 파일을 이용하여 부가정보를 제공하는 장치 및방법 |
EP2894831B1 (en) * | 2005-06-27 | 2020-06-03 | Core Wireless Licensing S.a.r.l. | Transport mechanisms for dynamic rich media scenes |
US20080130989A1 (en) * | 2005-07-22 | 2008-06-05 | Mitsubishi Electric Corporation | Image encoder and image decoder, image encoding method and image decoding method, image encoding program and image decoding program, and computer readable recording medium recorded with image encoding program and computer readable recording medium recorded with image decoding program |
US20070078898A1 (en) * | 2005-09-30 | 2007-04-05 | Yahoo! Inc. | Server-based system and method for retrieving tagged portions of media files |
US20070078883A1 (en) * | 2005-09-30 | 2007-04-05 | Yahoo! Inc. | Using location tags to render tagged portions of media files |
US20070078896A1 (en) * | 2005-09-30 | 2007-04-05 | Yahoo! Inc. | Identifying portions within media files with location tags |
US8271551B2 (en) * | 2005-10-13 | 2012-09-18 | Lg Electronics Inc. | Method and apparatus for encoding/decoding |
US7743363B2 (en) * | 2005-10-13 | 2010-06-22 | Microsoft Corporation | Extensible meta-data |
US8180826B2 (en) * | 2005-10-31 | 2012-05-15 | Microsoft Corporation | Media sharing and authoring on the web |
US8856118B2 (en) * | 2005-10-31 | 2014-10-07 | Qwest Communications International Inc. | Creation and transmission of rich content media |
JPWO2007052395A1 (ja) * | 2005-10-31 | 2009-04-30 | シャープ株式会社 | 視聴環境制御装置、視聴環境制御システム、視聴環境制御方法、データ送信装置及びデータ送信方法 |
US8196032B2 (en) * | 2005-11-01 | 2012-06-05 | Microsoft Corporation | Template-based multimedia authoring and sharing |
JP2007179435A (ja) * | 2005-12-28 | 2007-07-12 | Sony Corp | 情報処理装置、情報処理方法、プログラム |
US7421455B2 (en) * | 2006-02-27 | 2008-09-02 | Microsoft Corporation | Video search and services |
US20070204238A1 (en) * | 2006-02-27 | 2007-08-30 | Microsoft Corporation | Smart Video Presentation |
US7873946B2 (en) * | 2006-03-23 | 2011-01-18 | Oracle America, Inc. | Scalable vector graphics, tree and tab as drag and drop objects |
JP2007265341A (ja) * | 2006-03-30 | 2007-10-11 | Sony Corp | コンテンツ利用方法、コンテンツ利用装置、コンテンツ記録方法、コンテンツ記録装置、コンテンツ提供システム、コンテンツ受信方法、コンテンツ受信装置およびコンテンツデータフォーマット |
JP4377887B2 (ja) * | 2006-03-30 | 2009-12-02 | 株式会社東芝 | 映像分割装置 |
US8549492B2 (en) * | 2006-04-21 | 2013-10-01 | Microsoft Corporation | Machine declarative language for formatted data processing |
US7827155B2 (en) * | 2006-04-21 | 2010-11-02 | Microsoft Corporation | System for processing formatted data |
JP4760572B2 (ja) * | 2006-06-30 | 2011-08-31 | ソニー株式会社 | 編集装置および編集方法、並びにプログラム |
US8275814B2 (en) | 2006-07-12 | 2012-09-25 | Lg Electronics Inc. | Method and apparatus for encoding/decoding signal |
US20080019281A1 (en) * | 2006-07-21 | 2008-01-24 | Microsoft Corporation | Reuse of available source data and localizations |
US7769363B2 (en) * | 2006-08-01 | 2010-08-03 | Chew Gregory T H | User-initiated communications during multimedia content playback on a mobile communications device |
US20080065693A1 (en) * | 2006-09-11 | 2008-03-13 | Bellsouth Intellectual Property Corporation | Presenting and linking segments of tagged media files in a media services network |
WO2008032739A1 (fr) * | 2006-09-12 | 2008-03-20 | Panasonic Corporation | Dispositif de formation d'image de contenu |
WO2008048067A1 (en) | 2006-10-19 | 2008-04-24 | Lg Electronics Inc. | Encoding method and apparatus and decoding method and apparatus |
KR101317204B1 (ko) * | 2006-11-27 | 2013-10-10 | 삼성전자주식회사 | 동적 영상물의 프레임 정보를 생성하는 방법 및 이를이용한 장치 |
KR100827241B1 (ko) * | 2006-12-18 | 2008-05-07 | 삼성전자주식회사 | 동적 영상물을 생성하기 위한 템플릿을 편집하는 장치 및방법 |
FR2910769B1 (fr) * | 2006-12-21 | 2009-03-06 | Thomson Licensing Sas | Procede de creation d'un resume d'un document audiovisuel comportant un sommaire et des reportages, et recepteur mettant en oeuvre le procede |
US8671346B2 (en) * | 2007-02-09 | 2014-03-11 | Microsoft Corporation | Smart video thumbnail |
KR100864524B1 (ko) * | 2007-02-14 | 2008-10-21 | 주식회사 드리머 | 디지털 방송 데이터 어플리케이션 실행 방법 및 이를실현시키기 위한 프로그램을 기록한 컴퓨터로 판독 가능한기록 매체 |
JP4469868B2 (ja) * | 2007-03-27 | 2010-06-02 | 株式会社東芝 | 説明表現付加装置、プログラムおよび説明表現付加方法 |
WO2008129600A1 (ja) * | 2007-04-05 | 2008-10-30 | Sony Computer Entertainment Inc. | コンテンツ再生装置、コンテンツ配信装置、コンテンツ配信システム及びメタデータ生成方法 |
KR100935862B1 (ko) * | 2007-07-06 | 2010-01-07 | 드리머 | 매체 재생 장치 기반 컨텐츠 제공 시스템 |
JP4360428B2 (ja) * | 2007-07-19 | 2009-11-11 | ソニー株式会社 | 記録装置、記録方法、コンピュータプログラムおよび記録媒体 |
JP4420085B2 (ja) * | 2007-08-20 | 2010-02-24 | ソニー株式会社 | データ処理装置、データ処理方法、プログラムおよび記録媒体 |
KR101268987B1 (ko) * | 2007-09-11 | 2013-05-29 | 삼성전자주식회사 | 메타데이터를 자동적으로 생성/갱신하는 멀티미디어 데이터기록 방법 및 장치 |
KR20090031142A (ko) * | 2007-09-21 | 2009-03-25 | 삼성전자주식회사 | 컨텐츠 생성시 관련된 컨텐츠를 표시하는 gui 제공방법및 이를 적용한 멀티미디어 기기 |
KR101034758B1 (ko) * | 2007-10-04 | 2011-05-17 | 에스케이 텔레콤주식회사 | 통합 멀티미디어 파일의 초기 실행 방법과 이를 위한시스템 |
US20090158157A1 (en) * | 2007-12-14 | 2009-06-18 | Microsoft Corporation | Previewing recorded programs using thumbnails |
KR20090079010A (ko) * | 2008-01-16 | 2009-07-21 | 삼성전자주식회사 | 프로그램 정보 표시 방법 및 장치 |
JP5188260B2 (ja) * | 2008-05-08 | 2013-04-24 | キヤノン株式会社 | 画像処理装置、画像処理方法ならびにそのプログラムおよび記憶媒体 |
US20090287655A1 (en) * | 2008-05-13 | 2009-11-19 | Bennett James D | Image search engine employing user suitability feedback |
JP2011523309A (ja) * | 2008-06-06 | 2011-08-04 | ディヴィクス インコーポレイテッド | マルチメディアファイルのためのフォントファイル最適化システム及び方法 |
US20090315981A1 (en) * | 2008-06-24 | 2009-12-24 | Samsung Electronics Co., Ltd. | Image processing method and apparatus |
KR101539935B1 (ko) * | 2008-06-24 | 2015-07-28 | 삼성전자주식회사 | 3차원 비디오 영상 처리 방법 및 장치 |
US20090315980A1 (en) * | 2008-06-24 | 2009-12-24 | Samsung Electronics Co., | Image processing method and apparatus |
US20090317062A1 (en) * | 2008-06-24 | 2009-12-24 | Samsung Electronics Co., Ltd. | Image processing method and apparatus |
JP5649273B2 (ja) * | 2008-08-25 | 2015-01-07 | 株式会社東芝 | 情報処理装置、情報処理方法および情報処理プログラム |
JP5091806B2 (ja) * | 2008-09-01 | 2012-12-05 | 株式会社東芝 | 映像処理装置及びその方法 |
JP5322550B2 (ja) * | 2008-09-18 | 2013-10-23 | 三菱電機株式会社 | 番組推奨装置 |
US8239359B2 (en) * | 2008-09-23 | 2012-08-07 | Disney Enterprises, Inc. | System and method for visual search in a video media player |
KR101592943B1 (ko) * | 2008-12-11 | 2016-02-12 | 엘지전자 주식회사 | 이동 단말기 및 이동 단말기의 데이터 관리 방법 |
KR20110032610A (ko) * | 2009-09-23 | 2011-03-30 | 삼성전자주식회사 | 장면 분할 장치 및 방법 |
KR20110047768A (ko) | 2009-10-30 | 2011-05-09 | 삼성전자주식회사 | 멀티미디어 컨텐츠 재생 장치 및 방법 |
CN102065237B (zh) * | 2009-11-13 | 2014-12-24 | 新奥特(北京)视频技术有限公司 | 一种方便处理字幕文件的字幕机 |
WO2011059275A2 (en) * | 2009-11-13 | 2011-05-19 | Samsung Electronics Co., Ltd. | Method and apparatus for managing data |
JP2011130279A (ja) * | 2009-12-18 | 2011-06-30 | Sony Corp | コンテンツ提供サーバ、コンテンツ再生装置、コンテンツ提供方法、コンテンツ再生方法、プログラムおよびコンテンツ提供システム |
EP2517466A4 (en) * | 2009-12-21 | 2013-05-08 | Estefano Emilio Isaias | SYSTEM AND METHOD FOR VIDEO SEGMENT MANAGEMENT AND DISTRIBUTION |
JP2011188342A (ja) * | 2010-03-10 | 2011-09-22 | Sony Corp | 情報処理装置、情報処理方法及びプログラム |
JP5913788B2 (ja) * | 2010-03-25 | 2016-04-27 | ソニー株式会社 | コンテンツサーバ、コンテンツ提供システム及びコンテンツ提供方法 |
KR101746453B1 (ko) * | 2010-04-12 | 2017-06-13 | 삼성전자주식회사 | 실감 효과 처리 시스템 및 방법 |
US9191639B2 (en) | 2010-04-12 | 2015-11-17 | Adobe Systems Incorporated | Method and apparatus for generating video descriptions |
US9276986B2 (en) * | 2010-04-27 | 2016-03-01 | Nokia Technologies Oy | Systems, methods, and apparatuses for facilitating remote data processing |
GB2481185A (en) * | 2010-05-28 | 2011-12-21 | British Broadcasting Corp | Processing audio-video data to produce multi-dimensional complex metadata |
US8806340B2 (en) * | 2010-09-01 | 2014-08-12 | Hulu, LLC | Method and apparatus for embedding media programs having custom user selectable thumbnails |
JP5671288B2 (ja) * | 2010-09-24 | 2015-02-18 | 任天堂株式会社 | 情報処理装置、情報処理プログラム、情報処理方法及び情報処理システム |
US9753609B2 (en) * | 2010-12-03 | 2017-09-05 | Facebook, Inc. | User interface with media wheel facilitating viewing of media objects |
US8587672B2 (en) | 2011-01-31 | 2013-11-19 | Home Box Office, Inc. | Real-time visible-talent tracking system |
US9264484B1 (en) * | 2011-02-09 | 2016-02-16 | Google Inc. | Attributing preferences to locations for serving content |
CN102143001B (zh) * | 2011-04-02 | 2013-10-09 | 西南科技大学 | 一种基于语义理解的音频资源管理方法 |
GB2491894A (en) * | 2011-06-17 | 2012-12-19 | Ant Software Ltd | Processing supplementary interactive content in a television system |
US9146909B2 (en) * | 2011-07-27 | 2015-09-29 | Qualcomm Incorporated | Web browsing enhanced by cloud computing |
KR101315608B1 (ko) * | 2011-09-27 | 2013-10-18 | 엘지전자 주식회사 | 컨텐츠 관리 방법 및 그를 이용한 영상 표시 장치 |
US20130089301A1 (en) * | 2011-10-06 | 2013-04-11 | Chi-cheng Ju | Method and apparatus for processing video frames image with image registration information involved therein |
US9536044B2 (en) | 2011-12-06 | 2017-01-03 | Microsoft Technology Licensing, Llc | Metadata extraction pipeline |
US9525642B2 (en) | 2012-01-31 | 2016-12-20 | Db Networks, Inc. | Ordering traffic captured on a data connection |
US9100291B2 (en) * | 2012-01-31 | 2015-08-04 | Db Networks, Inc. | Systems and methods for extracting structured application data from a communications link |
TWI461955B (zh) * | 2012-02-20 | 2014-11-21 | Univ Nat Cheng Kung | 惡意程式破壞系統及其破壞方法 |
CN104126307B (zh) | 2012-02-29 | 2018-02-06 | 杜比实验室特许公司 | 用于改善的图像处理和内容传递的图像元数据创建处理器及方法 |
KR101332834B1 (ko) * | 2012-04-03 | 2013-11-27 | 모젼스랩(주) | 온톨로지 기반 전시관련 서비스 제공방법 |
KR101952260B1 (ko) | 2012-04-03 | 2019-02-26 | 삼성전자주식회사 | 다수의 동영상 썸네일을 동시에 표시하기 위한 영상 재생 장치 및 방법 |
SE537206C2 (sv) * | 2012-04-11 | 2015-03-03 | Vidispine Ab | Metod och system för sökningar i digitalt innehåll |
EP2680601A1 (en) * | 2012-06-29 | 2014-01-01 | Moda e Technologia S.r.l. | Video streams management system for a television apparatus |
US20140136545A1 (en) | 2012-10-05 | 2014-05-15 | International Business Machines Corporation | Multi-tier Indexing Methodology for Scalable Mobile Device Data Collection |
TWI474201B (zh) * | 2012-10-17 | 2015-02-21 | Inst Information Industry | Construction system scene fragment, method and recording medium |
KR101537665B1 (ko) * | 2013-02-26 | 2015-07-20 | 주식회사 알티캐스트 | 콘텐츠 재생 방법 및 장치 |
EP2809077A1 (en) * | 2013-05-27 | 2014-12-03 | Thomson Licensing | Method and apparatus for classification of a file |
EP2809078A1 (en) * | 2013-05-27 | 2014-12-03 | Thomson Licensing | Method and apparatus for managing metadata files |
US9197926B2 (en) | 2013-06-05 | 2015-11-24 | International Business Machines Corporation | Location based determination of related content |
EP3028446A1 (en) | 2013-07-30 | 2016-06-08 | Dolby Laboratories Licensing Corporation | System and methods for generating scene stabilized metadata |
US20150331551A1 (en) * | 2014-05-14 | 2015-11-19 | Samsung Electronics Co., Ltd. | Image display apparatus, image display method, and computer-readable recording medium |
US20170091197A1 (en) * | 2014-05-19 | 2017-03-30 | Huawei Technologies Co., Ltd. | Multimedia Display Method, Apparatus, and Device |
KR102190233B1 (ko) | 2014-10-06 | 2020-12-11 | 삼성전자주식회사 | 영상 처리 장치 및 이의 영상 처리 방법 |
KR101640317B1 (ko) | 2014-11-20 | 2016-07-19 | 소프트온넷(주) | 오디오 및 비디오 데이터를 포함하는 영상의 저장 및 검색 장치와 저장 및 검색 방법 |
KR102380979B1 (ko) * | 2015-01-05 | 2022-04-01 | 삼성전자 주식회사 | 이미지의 메타데이터 관리 방법 및 장치 |
KR102306538B1 (ko) * | 2015-01-20 | 2021-09-29 | 삼성전자주식회사 | 콘텐트 편집 장치 및 방법 |
JP2016144080A (ja) * | 2015-02-03 | 2016-08-08 | ソニー株式会社 | 情報処理装置、情報処理システム、情報処理方法及びプログラム |
KR102310241B1 (ko) | 2015-04-29 | 2021-10-08 | 삼성전자주식회사 | 소스 디바이스, 그의 제어 방법, 싱크 디바이스 및 그의 화질 개선 처리 방법 |
US9554160B2 (en) * | 2015-05-18 | 2017-01-24 | Zepp Labs, Inc. | Multi-angle video editing based on cloud video sharing |
US10462524B2 (en) * | 2015-06-23 | 2019-10-29 | Facebook, Inc. | Streaming media presentation system |
US9917870B2 (en) | 2015-06-23 | 2018-03-13 | Facebook, Inc. | Streaming media presentation system |
US10375443B2 (en) | 2015-07-31 | 2019-08-06 | Rovi Guides, Inc. | Method for enhancing a user viewing experience when consuming a sequence of media |
US9966110B2 (en) * | 2015-10-16 | 2018-05-08 | Tribune Broadcasting Company, Llc | Video-production system with DVE feature |
US10645465B2 (en) * | 2015-12-21 | 2020-05-05 | Centurylink Intellectual Property Llc | Video file universal identifier for metadata resolution |
US11023417B2 (en) * | 2017-05-30 | 2021-06-01 | Home Box Office, Inc. | Video content graph including enhanced metadata |
CN108829881B (zh) * | 2018-06-27 | 2021-12-03 | Shenzhen Tencent Network Information Technology Co., Ltd. | Video title generation method and apparatus
JP6688368B1 (ja) * | 2018-11-13 | 2020-04-28 | Nippon Telegraph & Telephone West Corp | Video content structuring apparatus, video content structuring method, and computer program
CN112150778A (zh) * | 2019-06-29 | 2020-12-29 | Huawei Technologies Co., Ltd. | Ambient sound processing method and related apparatus
KR102250642B1 (ko) * | 2019-10-31 | 2021-05-11 | Techon Media Co., Ltd. | Decentralized content distribution management system for efficient content distribution, and computing device for performing the same
JP2021132281A (ja) * | 2020-02-19 | 2021-09-09 | JCC Corp | Metadata generation system and metadata generation method
US20210319230A1 (en) * | 2020-04-10 | 2021-10-14 | Gracenote, Inc. | Keyframe Extractor |
US11526612B2 (en) | 2020-09-22 | 2022-12-13 | International Business Machines Corporation | Computer file metadata segmentation security system |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001167099A (ja) * | 1999-12-07 | 2001-06-22 | Nippon Hoso Kyokai <NHK> | Database apparatus based on video/audio objects
JP2001320693A (ja) * | 2000-05-12 | 2001-11-16 | Sony Corp | Service providing apparatus and method, receiving terminal apparatus and method, and service providing system
JP2001357008A (ja) * | 2000-06-14 | 2001-12-26 | Mitsubishi Electric Corp | Content search and distribution apparatus and content search and distribution method
JP2002041541A (ja) * | 2000-05-19 | 2002-02-08 | Jisedai Joho Hoso System Kenkyusho:Kk | Video retrieval apparatus
JP2002051287A (ja) * | 2000-08-04 | 2002-02-15 | Sony Corp | Program recording support system and method, and program viewing service system and program viewing service providing method
Family Cites Families (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5119465A (en) * | 1989-06-19 | 1992-06-02 | Digital Equipment Corporation | System for selectively converting plurality of source data structures through corresponding source intermediate structures, and target intermediate structures into selected target structure |
US5758180A (en) * | 1993-04-15 | 1998-05-26 | Sony Corporation | Block resizing function for multi-media editing which moves other blocks in response to the resize only as necessary |
JP3194837B2 (ja) | 1994-07-19 | 2001-08-06 | Nippon Telegraph & Telephone Corp | Representative screen extraction method and apparatus
JP3144285B2 (ja) | 1995-11-30 | 2001-03-12 | Matsushita Electric Industrial Co., Ltd. | Video processing apparatus
JP3529540B2 (ja) * | 1996-03-07 | 2004-05-24 | KDDI Corp | Moving image retrieval apparatus
WO1997034240A1 (en) * | 1996-03-15 | 1997-09-18 | University Of Massachusetts | Compact tree for storage and retrieval of structured hypermedia documents |
US5986675A (en) * | 1996-05-24 | 1999-11-16 | Microsoft Corporation | System and method for animating an object in three-dimensional space using a two-dimensional input device |
US20010038719A1 (en) * | 1996-10-14 | 2001-11-08 | Nikon Corporation | Information processing apparatus |
JP3633229B2 (ja) * | 1997-09-01 | 2005-03-30 | Seiko Epson Corp | Method for manufacturing a light-emitting element and method for manufacturing a multicolor display device
AUPO918697A0 (en) * | 1997-09-15 | 1997-10-09 | Canon Information Systems Research Australia Pty Ltd | Enhanced information gathering apparatus and method |
KR100284574B1 (ko) * | 1997-10-27 | 2001-03-15 | Chung Sun-Jong | Automatic object storage system and operating method thereof
US6134558A (en) * | 1997-10-31 | 2000-10-17 | Oracle Corporation | References that indicate where global database objects reside |
US6751623B1 (en) * | 1998-01-26 | 2004-06-15 | At&T Corp. | Flexible interchange of coded multimedia facilitating access and streaming |
JPH11238071A (ja) * | 1998-02-20 | 1999-08-31 | Toshiba Corp | Digest creation apparatus and digest creation method
US6085198A (en) * | 1998-06-05 | 2000-07-04 | Sun Microsystems, Inc. | Integrated three-tier application framework with automated class and table generation |
US6711590B1 (en) * | 1998-07-10 | 2004-03-23 | Canon Kabushiki Kaisha | Linking metadata with a time-sequential digital signal |
KR100279735B1 (ko) * | 1998-11-20 | 2001-02-01 | Chung Sun-Jong | Multimedia content delivery method using metadata
JP2000253337A (ja) * | 1999-02-24 | 2000-09-14 | Sony Corp | Screen control method and apparatus, video playback method and apparatus, video information recording method and apparatus, and computer-readable recording medium
US7362946B1 (en) * | 1999-04-12 | 2008-04-22 | Canon Kabushiki Kaisha | Automated visual image editing system |
JP4227241B2 (ja) * | 1999-04-13 | 2009-02-18 | Canon Inc | Image processing apparatus and method
JP2001008136A (ja) * | 1999-06-21 | 2001-01-12 | Victor Co Of Japan Ltd | Authoring apparatus for multimedia data
JP2001028722A (ja) | 1999-07-13 | 2001-01-30 | Matsushita Electric Ind Co Ltd | Moving image management apparatus and moving image management system
JP2001111957A (ja) * | 1999-08-16 | 2001-04-20 | Univ Of Washington | Interactive processing method for video sequences, and storage medium and system therefor
WO2001020908A1 (en) * | 1999-09-16 | 2001-03-22 | Ixl Enterprises, Inc. | System and method for linking media content |
KR100373371B1 (ko) | 1999-09-20 | 2003-02-25 | Electronics and Telecommunications Research Institute | Video data retrieval method applying a metadata importance determination technique
JP3738631B2 (ja) * | 1999-09-27 | 2006-01-25 | Mitsubishi Electric Corp | Image retrieval system and image retrieval method
WO2001024046A2 (en) | 1999-09-29 | 2001-04-05 | Xml-Global Technologies, Inc. | Authoring, altering, indexing, storing and retrieving electronic documents embedded with contextual markup |
KR100371813B1 (ko) | 1999-10-11 | 2003-02-11 | Electronics and Telecommunications Research Institute | Summary video description scheme for efficient video overview and browsing, recording medium therefor, method and system for generating summary video description data, and apparatus and method for browsing summary video description data
KR100305964B1 (ko) | 1999-10-22 | 2001-11-02 | Koo Ja-Hong | Method for providing user-adaptive multi-level summary streams
JP3478331B2 (ja) * | 1999-10-29 | 2003-12-15 | Ricoh Co., Ltd. | Structure display method, and computer-readable recording medium storing a program for causing a computer to execute the method
JP2001167109A (ja) | 1999-12-08 | 2001-06-22 | Kddi Corp | Method for constructing feature description sets for audio-video information
US7212972B2 (en) * | 1999-12-08 | 2007-05-01 | Ddi Corporation | Audio features description method and audio video features description collection construction method |
AU780811B2 (en) | 2000-03-13 | 2005-04-21 | Sony Corporation | Method and apparatus for generating compact transcoding hints metadata |
KR100739031B1 (ko) * | 2000-03-27 | 2007-07-25 | Curon Co., Ltd. | Method for hiding and detecting MPEG-7 standard metadata in a multimedia retrieval system, and multimedia data retrieval method using the same
JP3517631B2 (ja) * | 2000-05-08 | 2004-04-12 | Ricoh Co., Ltd. | Digest video storage method and digest video storage apparatus
JP4953496B2 (ja) * | 2000-05-15 | 2012-06-13 | Sony Corp | Content search and presentation system and method, and software storage medium
US6646676B1 (en) * | 2000-05-17 | 2003-11-11 | Mitsubishi Electric Research Laboratories, Inc. | Networked surveillance and control system |
JP2001326901A (ja) | 2000-05-18 | 2001-11-22 | Sharp Corp | Moving image scene information management apparatus
AUPQ867700A0 (en) | 2000-07-10 | 2000-08-03 | Canon Kabushiki Kaisha | Delivering multimedia descriptions |
KR100369222B1 (ko) * | 2000-07-26 | 2003-01-24 | Changsung Academy (educational foundation) | Digital video retrieval and authoring tool
US7054508B2 (en) * | 2000-08-03 | 2006-05-30 | Canon Kabushiki Kaisha | Data editing apparatus and method |
US6959326B1 (en) * | 2000-08-24 | 2005-10-25 | International Business Machines Corporation | Method, system, and program for gathering indexable metadata on content at a data repository |
US20020087987A1 (en) * | 2000-11-16 | 2002-07-04 | Dudkiewicz Gil Gavriel | System and method for creating and editing a viewer profile used in determining the desirability of video programming events |
JP4536940B2 (ja) * | 2001-01-26 | 2010-09-01 | Canon Inc | Image processing apparatus, image processing method, storage medium, and computer program
US7254570B2 (en) * | 2001-03-21 | 2007-08-07 | Nokia Corporation | Query resolution system and service |
JP2003067397A (ja) | 2001-06-11 | 2003-03-07 | Matsushita Electric Ind Co Ltd | Content management system
US20030088876A1 (en) * | 2001-11-08 | 2003-05-08 | Liberate Technologies | Video on demand gateway |
US20030110501A1 (en) * | 2001-12-12 | 2003-06-12 | Rafey Richter A. | Personalizing media presentations based on a target duration |
JP3826048B2 (ja) * | 2002-02-15 | 2006-09-27 | Canon Inc | Information processing apparatus and method
JP3826043B2 (ja) * | 2002-01-31 | 2006-09-27 | Canon Inc | Information processing apparatus and method
JP2007179435A (ja) * | 2005-12-28 | 2007-07-12 | Sony Corp | Information processing apparatus, information processing method, and program
US8386438B2 (en) * | 2009-03-19 | 2013-02-26 | Symantec Corporation | Method for restoring data from a monolithic backup |
2003
- 2003-03-20 KR KR1020087012071A patent/KR100918725B1/ko active IP Right Grant
- 2003-03-20 CA CA2482431A patent/CA2482431C/en not_active Expired - Fee Related
- 2003-03-20 EP EP03712804A patent/EP1496701A4/en not_active Withdrawn
- 2003-03-20 CA CA2664732A patent/CA2664732C/en not_active Expired - Fee Related
- 2003-03-20 KR KR1020067015167A patent/KR100986401B1/ko active IP Right Grant
- 2003-03-20 AU AU2003221185A patent/AU2003221185A1/en not_active Abandoned
- 2003-03-20 EP EP10002877A patent/EP2202649A1/en not_active Withdrawn
- 2003-03-20 US US10/510,548 patent/US7826709B2/en not_active Expired - Fee Related
- 2003-03-20 EP EP10002878A patent/EP2202978A1/en not_active Withdrawn
- 2003-03-20 EP EP10002874A patent/EP2202977A1/en not_active Withdrawn
- 2003-03-20 EP EP10002875A patent/EP2202648A1/en not_active Withdrawn
- 2003-03-20 SG SG200505671-8A patent/SG152905A1/en unknown
- 2003-03-20 EP EP10002876A patent/EP2200315A1/en not_active Withdrawn
- 2003-03-20 KR KR1020047016204A patent/KR100912984B1/ko active IP Right Grant
- 2003-03-20 JP JP2003585438A patent/JPWO2003088665A1/ja active Pending
- 2003-03-20 WO PCT/JP2003/003450 patent/WO2003088665A1/ja active Application Filing
- 2003-03-20 CN CNB038082608A patent/CN100367794C/zh not_active Expired - Fee Related
- 2003-03-20 KR KR1020107009513A patent/KR100997599B1/ko active IP Right Grant
- 2003-03-20 EP EP10002879A patent/EP2202979A1/en not_active Withdrawn
- 2003-03-31 TW TW092107243A patent/TWI231140B/zh not_active IP Right Cessation
2007
- 2007-10-31 US US11/980,648 patent/US8811800B2/en not_active Expired - Fee Related
- 2007-10-31 US US11/980,544 patent/US20080065697A1/en not_active Abandoned
- 2007-10-31 US US11/980,523 patent/US20080071836A1/en not_active Abandoned
- 2007-10-31 US US11/980,514 patent/US20080075431A1/en not_active Abandoned
- 2007-10-31 US US11/980,624 patent/US20080071837A1/en not_active Abandoned
2009
- 2009-05-01 JP JP2009111990A patent/JP4652462B2/ja not_active Expired - Fee Related
- 2009-05-01 JP JP2009111989A patent/JP2009171622A/ja active Pending
- 2009-05-01 JP JP2009111991A patent/JP4987907B2/ja not_active Expired - Fee Related
- 2009-05-01 JP JP2009111988A patent/JP2009171621A/ja active Pending
- 2009-09-08 US US12/555,510 patent/US20100005070A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See also references of EP1496701A4 * |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007527142A (ja) * | 2003-07-08 | 2007-09-20 | Matsushita Electric Industrial Co., Ltd. | Content storage system, home server apparatus, information providing apparatus, integrated circuit, and program
JP2005192196A (ja) * | 2003-11-12 | 2005-07-14 | Sony Internatl Europ Gmbh | Method for defining a subset of a video signal, and automatic program summarization apparatus
US8059161B2 (en) | 2004-08-30 | 2011-11-15 | Sony Corporation | Image-information recording device and image-information display device |
JP2006066015A (ja) * | 2004-08-30 | 2006-03-09 | Sony Corp | Image information recording apparatus and image information display apparatus
US9063955B2 (en) | 2004-12-24 | 2015-06-23 | Koninklijke Philips N.V. | Method and apparatus for editing program search information |
JP2008526071A (ja) * | 2004-12-24 | 2008-07-17 | Koninklijke Philips Electronics N.V. | Method and apparatus for editing program search information
JP4978894B2 (ja) * | 2005-01-25 | 2012-07-18 | NEC Corp | Structured document retrieval apparatus, structured document retrieval method, and structured document retrieval program
JP2013201763A (ja) * | 2005-07-19 | 2013-10-03 | Apple Inc. | Media data transmission method and apparatus
CN103309933A (zh) * | 2005-07-19 | 2013-09-18 | Apple Inc. | Method and device for media data transmission
JP2007041861A (ja) * | 2005-08-03 | 2007-02-15 | Sharp Corp | Content editing apparatus, computer-readable program, and recording medium storing the program
JP4514671B2 (ja) * | 2005-08-03 | 2010-07-28 | Sharp Corp | Content editing apparatus, computer-readable program, and recording medium storing the program
JPWO2008041629A1 (ja) * | 2006-09-29 | 2010-02-04 | Sony Corp | Playback apparatus and method, information generation apparatus and method, data storage medium, data structure, program storage medium, and program
JP2008099012A (ja) * | 2006-10-12 | 2008-04-24 | Mitsubishi Electric Corp | Content playback system and content storage system
JP2014197879A (ja) * | 2007-01-05 | 2014-10-16 | Sonic IP, Inc. | Video distribution system including progressive playback
US9794318B2 (en) | 2007-01-05 | 2017-10-17 | Sonic Ip, Inc. | Video distribution system including progressive playback |
US10412141B2 (en) | 2007-01-05 | 2019-09-10 | Divx, Llc | Systems and methods for seeking within multimedia content during streaming playback |
US10574716B2 (en) | 2007-01-05 | 2020-02-25 | Divx, Llc | Video distribution system including progressive playback |
US11050808B2 (en) | 2007-01-05 | 2021-06-29 | Divx, Llc | Systems and methods for seeking within multimedia content during streaming playback |
US11706276B2 (en) | 2007-01-05 | 2023-07-18 | Divx, Llc | Systems and methods for seeking within multimedia content during streaming playback |
JP4511569B2 (ja) * | 2007-03-26 | 2010-07-28 | NTT Docomo, Inc. | Remote viewing system and remote viewing method
JP2008244656A (ja) * | 2007-03-26 | 2008-10-09 | Ntt Docomo Inc | Remote viewing system and remote viewing method
JP2013051707A (ja) * | 2007-04-17 | 2013-03-14 | Thomson Licensing | Method for transmitting video data and associated metadata in a data stream
WO2008136466A1 (ja) * | 2007-05-01 | 2008-11-13 | Dep Co., Ltd. | Video editing apparatus
JP2009152927A (ja) * | 2007-12-21 | 2009-07-09 | Sony Corp | Content playback method and playback system
JP2009171480A (ja) * | 2008-01-21 | 2009-07-30 | Hitachi Ltd | Video recording/playback apparatus and video playback apparatus
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4987907B2 (ja) | Metadata processing apparatus | |
US7181757B1 (en) | Video summary description scheme and method and system of video summary description data generation for efficient overview and browsing | |
KR100686521B1 (ko) | Method and system for encoding/decoding a video multimedia application file format for the integration of video and metadata | |
CA2387404A1 (en) | Video summary description scheme and method and system of video summary description data generation for efficient overview and browsing | |
JP4732418B2 (ja) | Metadata processing method | |
CN101132528A (zh) | Metadata reproduction, distribution, and retrieval apparatus, and metadata re-generation condition setting apparatus | |
JP4652389B2 (ja) | Metadata processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 10510548 Country of ref document: US |
WWE | Wipo information: entry into national phase |
Ref document number: 1020047016204 Country of ref document: KR |
WWE | Wipo information: entry into national phase |
Ref document number: 20038082608 Country of ref document: CN Ref document number: 2482431 Country of ref document: CA |
WWE | Wipo information: entry into national phase |
Ref document number: 2003712804 Country of ref document: EP |
WWP | Wipo information: published in national office |
Ref document number: 1020047016204 Country of ref document: KR |
WWP | Wipo information: published in national office |
Ref document number: 2003712804 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 2003585438 Country of ref document: JP |