US20070230807A1 - Moving image data processing apparatus and method - Google Patents

Moving image data processing apparatus and method

Info

Publication number
US20070230807A1
Authority
US
United States
Prior art keywords
editing
moving image
image data
segment
managing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US11/751,107
Other versions
US8644683B2
Inventor
Hirotaka Shiiyama
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2001-283960 (granted as JP3943880B2)
Priority to US10/242,618 (granted as US7257311B2)
Application filed by Canon Inc
Application US11/751,107 (granted as US8644683B2)
Publication of US20070230807A1
Application granted
Publication of US8644683B2
Application status: Active
Adjusted expiration

Classifications

    • G11B 27/031 — deleted: Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B 27/034: Electronic editing of digitised analogue information signals on discs
    • G11B 27/10: Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/105: Programmed access in sequence to addressed parts of tracks of operating discs
    • H04N 21/8352: Generation of protective data involving content or source identification data, e.g. Unique Material Identifier [UMID]
    • H04N 21/84: Generation or processing of descriptive data, e.g. content descriptors
    • H04N 21/8547: Content authoring involving timestamps for synchronizing content
    • G11B 2220/2562: DVDs [digital versatile discs]

Abstract

A moving image processing apparatus gives an ID to each video segment obtained by dividing moving image data, and stores information associating the section of each video segment with its ID as video segment section information. Meta-data corresponding to each video segment is managed by associating it with the ID given to that segment. Editing is performed in units of video segments by manipulating the arrangement of video segment IDs. Therefore, even after editing, no inconsistency arises when referring to the meta-data information, which is equivalent to having the meta-data updated in synchronization with the moving image editing. Thus, virtual editing of the moving image can automatically be followed by an update of the meta-data, alleviating the editor's burden of re-editing the meta-data.

Description

    FIELD OF THE INVENTION
  • The present invention relates to a moving image data processing apparatus and a method thereof.
  • BACKGROUND OF THE INVENTION
  • Conventionally, moving image data has been recorded on tape devices such as magnetic tape. Because such tape devices do not permit random access to the stored moving image data, video editing requires actually creating a new moving image stream. In recent years, however, it has become possible to store moving images on randomly accessible devices such as hard disks, so virtual editing, in which the moving image sections to be reproduced are merely specified in sequence, has become possible.
  • Such virtual editing is very convenient because re-editing and the like can be performed easily without losing or altering the original information. In addition, when the moving image is stored on a random-access device, a multifunctional moving image reproducing apparatus can be provided, since such storage enables moving image search (scene search) using meta-data as well as summary reproduction, which reproduces a digest of the moving image.
  • When the moving image has been edited, the meta-data and summary data must be edited to follow it. For instance, if the results of the moving image editing are not reflected in the meta-data, a meta-data search of the edited moving image may return portions that are no longer included in it. More specifically, consideration must be given to synchronizing virtual editing of the moving image with updates of the meta-data so that, when a video segment is searched for using the meta-data, video segments deleted by the virtual editing do not appear in the search results.
  • In addition, when there is a large amount of moving image, its contents are checked and grasped by viewing a summary; if the results of the moving image editing are not reflected in the summary data, an inconsistency arises between the edited moving image and the reproduced summary. For instance, if the arrangement of video segments is changed, or a video segment is deleted, in the virtual editing, the order of scenes in the summary reproduction differs from that of the edited moving image, or deleted scenes are reproduced, giving the viewer a sense of incongruity.
  • However, editing the meta-data and summary data in step with the moving image editing is very burdensome; even if editing the moving image itself becomes easier, the overall editing work tends to increase.
  • SUMMARY OF THE INVENTION
  • The present invention has been achieved in view of the above problems, and an object thereof is to make meta-data search results automatically follow virtual editing of a moving image, thereby alleviating the editor's burden of re-editing the meta-data.
  • Another object of the present invention is to make summary results automatically adapt to the virtual editing of the moving image, so as to allow summary reproduction that gives no sense of incongruity even after the moving image has been edited.
  • According to the present invention, the foregoing object is attained by providing a moving image data processing apparatus, comprising:
  • group managing means for dividing moving image data into groups comprised of a plurality of frames and giving them IDs to manage each group;
  • editing result storing means for storing an arrangement of the IDs obtained as editing results of the moving image data;
  • moving image reproducing means for reproducing the groups according to the arrangement of the IDs stored by the editing result storing means and thereby reproducing the moving image data as the editing results; and
  • meta-data managing means for managing meta-data corresponding to the groups by associating it with the IDs corresponding to the groups.
  • According to another aspect of the present invention, the foregoing object is attained by providing a moving image data processing method, comprising:
  • a group managing step of dividing moving image data into groups comprised of a plurality of frames and giving them IDs to manage each group;
  • an editing result storing step of storing an arrangement of the IDs obtained as editing results of the moving image data;
  • a moving image reproducing step of reproducing the groups according to the arrangement of the IDs stored in the editing result storing step and thereby reproducing the moving image data as the editing results; and
  • a meta-data managing step of managing meta-data corresponding to the groups by associating it with the IDs corresponding to the groups.
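As a non-authoritative illustration of how the four claimed elements fit together (the class and method names below are hypothetical, not from the patent), the data model can be sketched in Python: segments are keyed by ID, the editing result is just an arrangement of IDs, and meta-data is bound to the same IDs.

```python
# Hypothetical sketch: group managing, editing result storing,
# moving image reproducing, and meta-data managing, all keyed by segment ID.
class MovingImageProcessor:
    def __init__(self):
        self.segments = {}   # ID -> (start_frame, end_frame): group managing
        self.order = []      # arrangement of IDs: editing result storing
        self.metadata = {}   # ID -> meta-data dict: meta-data managing

    def add_segment(self, seg_id, start, end, meta=None):
        self.segments[seg_id] = (start, end)
        self.order.append(seg_id)
        self.metadata[seg_id] = meta or {}

    def playback_sections(self):
        # moving image reproducing: follow the stored ID arrangement
        return [self.segments[i] for i in self.order]

p = MovingImageProcessor()
p.add_segment(0, 0, 299, {"annotation": "opening"})
p.add_segment(1, 300, 599)
p.order = [1, 0]              # a virtual edit: swap the two segments
print(p.playback_sections())  # [(300, 599), (0, 299)]
```

Because the edit touches only `order`, the `metadata` entries never need rewriting.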
  • Other features and advantages of the present invention will be apparent from the following description taken in conjunction with the accompanying drawings, in which like reference characters designate the same or similar parts throughout the figures thereof.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
  • FIG. 1 is a block diagram showing a configuration of a moving image processing apparatus according to this embodiment;
  • FIG. 2 is a flowchart showing an operation overview of the moving image processing apparatus according to this embodiment;
  • FIG. 3 is a flowchart describing a generation process of video segment section information according to this embodiment;
  • FIG. 4 is a flowchart describing a generation procedure of meta-data information;
  • FIG. 5 is a flowchart describing the generation procedure of partial reproduction section information according to this embodiment;
  • FIG. 6 is a flowchart describing an editing process of moving image according to this embodiment;
  • FIG. 7 is a flowchart describing a procedure of summary reproduction according to this embodiment;
  • FIG. 8 is a flowchart describing a search process according to this embodiment;
  • FIG. 9 is a drawing showing a concept of division into video segments according to this embodiment;
  • FIG. 10 is a diagram showing an example of a data configuration of the video segment section information according to this embodiment;
  • FIG. 11 is a diagram showing an example of a data configuration of meta-data information according to this embodiment;
  • FIG. 12 is a diagram showing an example of a data configuration of partial reproduction section information for summary reproduction according to this embodiment;
  • FIG. 13 is a diagram showing an example of a data configuration of editing result information according to this embodiment; and
  • FIG. 14 is a diagram showing an example of the partial reproduction section information for the summary reproduction reflecting editing results.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Preferred embodiments of the present invention will now be described in detail in accordance with the accompanying drawings.
  • FIG. 1 is a block diagram showing a configuration of a moving image processing apparatus according to this embodiment. In FIG. 1, reference numeral 101 denotes a CPU, which performs various types of control, including those indicated in the flowcharts described later, by executing a control program stored in ROM 102 or RAM 103. Reference numeral 102 denotes a ROM, which stores the control program executed by the CPU 101 and various types of data. Reference numeral 103 denotes a RAM, which provides an area for loading the control program stored in an external storage device and a work area for the CPU 101 to perform the various types of control.
  • Reference numeral 104 denotes a display unit, which performs various types of display such as moving image reproduction. Reference numeral 105 denotes an operation portion, which is equipped with a keyboard and a mouse. Reference numeral 106 denotes a moving image reproducing apparatus, which reproduces a moving image recorded on a DVD, for instance. Reference numeral 107 denotes an external storage device, which stores video segment section information 110, meta-data information 111, partial reproduction section information 112, editing result information 113 and summary reproduction information 114, in addition to a control program 115 to be executed by the CPU 101. The information indicated by 110 to 114 is generated and held for each piece of moving image, and details thereof will be described later. Reference numeral 108 denotes a system bus connecting the above-mentioned components.
  • Operation of the moving image processing apparatus according to this embodiment having the above configuration will be described hereafter.
  • FIG. 2 is a flowchart showing an operation overview of the moving image processing apparatus according to this embodiment. First, moving image data is segmented in step S201, and an ID is given to each segment so as to manage it. The process in step S201 will be further described in detail by referring to FIGS. 9 and 10.
  • FIG. 3 is a flowchart describing a generation process of the video segment section information according to this embodiment. FIG. 9 is a drawing showing a concept of the division into the video segments according to this embodiment. In addition, FIG. 10 is a diagram showing an example of data structure of the video segment section information 110 according to this embodiment.
  • In step S301, a series of image frames is divided into video segments comprised of a plurality of image frames, as shown in FIG. 9. The following are examples of methods for dividing into video segments: (1) giving instructions on the displayed screen from the operation portion 105 and dividing manually; (2) automatically detecting scene changes and dividing according to the detection results; or (3) automatically detecting scene changes, dividing accordingly, and manually correcting the result thereafter. As a matter of course, a method other than these may be adopted; any method of division into video segments may be used.
  • Next, in step S302, IDs are sequentially allocated to the plurality of video segments. These IDs are used to generate the video segment section information 110 shown in FIG. 10 for managing the video segments, and the information is stored in the external storage device 107.
  • As shown in FIG. 10, the video segment section information according to this embodiment represents the section of each video segment with a start point and an end point. Moreover, while the section information (start point and end point) is represented by time codes in FIG. 10, frame numbers may also be used. More specifically, any representation method capable of representing a video section may be used.
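As a minimal sketch, assuming time codes as "HH:MM:SS:FF" strings (the values below are illustrative, not from FIG. 10), the video segment section information can be rendered as a mapping from segment ID to a (start, end) pair:

```python
# Hypothetical rendering of the video segment section information 110:
# each segment ID maps to a (start, end) pair. Frame numbers would
# work equally well as the section representation.
segment_sections = {
    0: ("00:00:00:00", "00:00:12:14"),
    1: ("00:00:12:15", "00:00:30:02"),
    2: ("00:00:30:03", "00:01:05:21"),
}

def section_of(seg_id):
    # Look up the section of a video segment by its ID.
    return segment_sections[seg_id]

print(section_of(1))  # ('00:00:12:15', '00:00:30:02')
```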
  • Returning to FIG. 2, when the process in step S201 is finished, the flow proceeds to step S202, where the meta-data information 111 is generated and stored in the external storage device 107. This process will be described in detail by referring to FIGS. 4 and 11. FIG. 4 is a flowchart describing the generation procedure of meta-data information. In addition, FIG. 11 is a diagram showing an example of a data configuration of the meta-data information according to this embodiment.
  • First, the video segment ID to which meta-data is to be given is specified in step S401. As for the method of specifying the ID, an ID number may be inputted directly from the operation portion 105, or the video segment ID containing the desired scene may be identified by performing a predetermined operation (such as clicking on the moving image reproduction area) at the desired scene during moving image reproduction.
  • Next, in step S402, the meta-data is described manually, automatically or semiautomatically in the meta-data field corresponding to the specified video segment ID. The following are examples of automatic or semiautomatic description. For instance, the date and time can be given fully automatically. If a GPS is available, the place can also be given automatically, with a building name and the like added manually to a rough place name where desired. Furthermore, in cases where the sequence of events and the timetable are clear, as in a wedding, meta-data such as “ceremony” and “wedding party,” or in more detail “exchange of rings,” “kiss,” “entrance,” “guest of honor's speech,” “toast” and “cake cutting,” may be given to the video segments, even if with some errors, by using pattern matching to estimate time information and time-series relationships. This is referred to as “automatic” assignment in this embodiment. Assignment is called “semiautomatic” when errors made by automatic assignment are corrected by a human, or when automatically assignable items are mixed with items that must be given manually. In step S403, the result is stored as the meta-data information 111 in the external storage device 107. FIG. 11 shows an example of a meta-data schema. In this example, a table is created for each piece of moving image, and the meta-data described in step S402 is managed for each video segment ID.
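A non-authoritative sketch of such a per-segment schema, with illustrative attribute names and values (not taken from FIG. 11), could look like this:

```python
# Hypothetical meta-data table for one moving image: attributes keyed
# by video segment ID. Attribute names and values are examples only.
metadata = {
    0: {"date": "2001-09-18", "place": "hotel", "annotation": "entrance"},
    1: {"date": "2001-09-18", "place": "hotel", "annotation": "toast"},
}

# Because the meta-data is bound to segment IDs, rearranging or marking
# segments as deleted during virtual editing never invalidates these entries.
print(metadata[1]["annotation"])  # toast
```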
  • Returning to FIG. 2, in step S203 the partial reproduction section information 112 for summary reproduction is generated and stored in the external storage device 107. FIG. 5 is a flowchart describing the generation procedure of the partial reproduction section information in step S203, and FIG. 12 is a diagram showing an example of a data configuration of the partial reproduction section information for summary reproduction according to this embodiment. The moving image summary referred to here is produced by sequentially reproducing partial moving images. Moreover, while FIG. 12 shows one partial reproduction section for summary in each video segment in order to simplify the description, this does not limit the present invention; an arbitrary number of partial reproduction sections may be set for each video segment. More specifically, either one or a plurality of partial moving images to be summary-reproduced may be registered for each video segment.
  • The flow in FIG. 5 shows an example of the process of generating the information for reproducing the moving image summary in one operation. The video segment to be summary-reproduced is selected in step S501, and the start and end points of partial reproduction are specified in step S502. This may be done either by manually setting the partial reproduction section while visually checking the actual moving image, or by performing some image analysis to extract the partial moving image automatically. As for automatic extraction by image analysis, a section of intense action, or one of little action, may be detected as the partial moving image to be reproduced, for instance. As a matter of course, the partial moving image to be reproduced may be determined by any other technique.
  • In step S503, it is determined whether another partial reproduction section for summary reproduction exists in the video segment specified in step S501 (for instance, by whether an instruction to set another partial reproduction section was given). If one exists, the flow returns to step S502 to specify that section. Thus, a plurality of partial reproduction sections can be set in the same video segment. If it is indicated that there are no more partial reproduction sections to specify, the flow proceeds from step S503 to step S504 to determine whether there is another video segment for which a partial reproduction section should be set (for instance, by whether such a video segment was specified). The flow returns to step S501 if there is another video segment, and finishes this process if not.
  • The partial reproduction section information 112 shown in FIG. 12 is generated as above and stored in the external storage device 107. The partial reproduction section information 112 is the information for reproducing the moving image summary, and is not changed by the virtual editing mentioned later. As shown in FIG. 12, it is represented as a list, for each video segment, of the partial moving images to be reproduced as the moving image summary. Each partial moving image is represented by its start and end points, like the video segment section information 110. Moreover, while FIG. 12 represents the sections of the partial moving images with time codes, frame numbers and so on may also be used; the form of representation does not matter as long as it can represent a video section, just as with the video segment section information 110.
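Under the same hypothetical time-code representation used earlier (values illustrative, not from FIG. 12), the partial reproduction section information can be sketched as a mapping from segment ID to a list of sections, so a segment may contribute one section or several:

```python
# Hypothetical form of the partial reproduction section information 112:
# each video segment ID maps to a list of (start, end) partial sections
# to be played during summary reproduction.
partial_sections = {
    0: [("00:00:02:00", "00:00:05:00")],
    1: [("00:00:14:00", "00:00:16:00"), ("00:00:25:00", "00:00:27:00")],
    2: [("00:00:40:00", "00:00:43:00")],
}

# Segment 1 contributes two partial sections to the summary.
print(len(partial_sections[1]))  # 2
```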
  • Returning to FIG. 2, the video segment section information 110, meta-data information 111 and partial reproduction section information 112 are generated and held in the external storage device 107 by the above steps S201 to S203. Moreover, the generation in steps S202 and S203 (generation of the meta-data information 111 and partial reproduction section information 112) may be performed at any time after the video segments are formed (after step S201). It is also possible to generate the partial reproduction section information 112 first.
  • On finishing the above process, the flow proceeds to step S204 onward, and processing according to various operations is performed. If an operation for starting editing is made from the operation portion 105, the flow proceeds from step S204 to step S205 to edit the moving image. As described below, the editing process performed here is virtual, and the editing result information 113 is generated and stored as a result of the editing. Hereafter, the editing process in step S205 and the editing result information consequently generated will be described by referring to FIGS. 6 and 13.
  • FIG. 6 is a flowchart describing the editing process of the moving image according to step S205. In addition, FIG. 13 is a diagram showing an example of a data configuration of the editing result information according to this embodiment.
  • First, in step S601, the moving image is edited by operations via the operation portion 105. The editing performed here consists of movement and deletion of video segments. The method of specifying the arrangement of the video segments in the editing operations of this embodiment is, as in an existing editing system, to display representative image frames of the scenes (video segments) as icons and to reorder or delete them. It is also possible, however, to use any other editing operation method.
  • The editing referred to in this embodiment does not rearrange the video segments of the original moving image itself; instead, the video segment IDs are rearranged into the edited order, and the video segments are reproduced in that rearranged order of IDs. More specifically, it is virtual editing. Likewise, a deleted video segment is not actually deleted; it is simply not reproduced.
  • Next, in step S602, the editing result information 113 reflecting the contents of the editing performed in step S601 is generated and stored in the external storage device 107. The editing result information 113 will be described by referring to FIG. 13.
  • The segment arrangement information before editing is sequential, as shown as Initial in FIG. 13. When the editing operations of step S601 are performed, an edited record 1301 (indicating the edited arrangement of the video segments) reflecting the editing results is generated in step S602. This record 1301 stores the edited segment arrangement information, where an ID marked with “*” is a video segment instructed to be deleted by the editing operation. Accordingly, if the edited video is reproduced in this example, the video segments are reproduced by random access in the order 0→1→3→4. As a matter of course, it is also possible to change the order of the video segments by editing, such as 0→1→4→3.
  • Moreover, while “*” is used here to represent a deleted video segment, any method may be used as long as deleted segments can be distinguished from undeleted ones; a method of separately managing the deleted video segment IDs is also conceivable. In addition, the initial segment arrangement information is redundant since it is merely sequential, so it is also feasible to retain only the edited segment arrangement information.
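One possible encoding of the editing result information, assuming the FIG. 13 example above (the boolean flag standing in for the “*” mark is a hypothetical choice), is an ordered list of (ID, deleted) pairs:

```python
# Hypothetical encoding of the editing result information 113:
# the edited record is an arrangement of segment IDs, with deleted
# segments flagged rather than removed.
initial = [0, 1, 2, 3, 4, 5]
edited = [(0, False), (1, False), (2, True),   # True = marked deleted ("*")
          (3, False), (4, False), (5, True)]

# Reproduction simply skips flagged IDs, yielding the order 0 -> 1 -> 3 -> 4.
playback = [seg_id for seg_id, deleted in edited if not deleted]
print(playback)  # [0, 1, 3, 4]
```

Because only the arrangement changes, the mapping from IDs to segment data and meta-data is untouched.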
  • Incidentally, according to this embodiment, even if the above editing operations are performed, only the segment arrangement information stored in the editing result information 113 is updated, so there is no change in the relationship between the segment IDs and the video segments. For this reason, the relationship between the meta-data and the video segments is also preserved. Accordingly, even at the moment when the editing is performed and the segment arrangement information is updated, no inconsistency arises in referring to the schema (meta-data information 111) shown in FIG. 11, which is equivalent to having the meta-data updated in synchronization with the moving image editing.
  • Returning to FIG. 2, if reproduction of the moving image is instructed, the flow proceeds from step S206 to step S207 to reproduce the specified moving image. When reproducing the moving image, the editing result information (FIG. 13) is referred to, and if the edited record 1301 exists, the video segments are reproduced according to it. In the example of FIG. 13, the video segments are reproduced in the order 0→1→3→4.
  • In addition, if summary reproduction is instructed, the flow proceeds from step S208 to step S209 to reproduce a summary reflecting the editing results. If the moving image editing mentioned above has been performed, the arrangement of the video segments has changed; unless the partial moving images are reproduced in keeping with that arrangement, a viewer of the summary will have a sense of incongruity due to the inconsistency in time series between the edited moving image and the summary. In addition, it is not appropriate to reproduce as part of the summary the partial moving images included in video segments deleted by the editing, so such partial moving images should not be reproduced in the summary.
  • The summary reproduction according to this embodiment solves the above problems, and will be described hereafter by referring to FIGS. 7 and 14. FIG. 7 is a flowchart describing the procedure of summary reproduction performed in step S209. FIG. 14 is a diagram showing an example of the partial reproduction section information for the summary reproduction reflecting the editing results.
  • In step S701, the editing result information of the image specified for summary reproduction is read. In step S702, a list of valid video segments is created by excluding the video segments to be deleted while keeping the video segment arrangement shown in the record 1301 of the editing result information; the number of valid video segments in the list is obtained and referred to as N. In step S703, a loop counter I is reset to 0, and the video segment ID at the head of the list generated in step S702 is obtained.
  • In step S704, it is determined whether I<N holds, that is, whether the loop counter I has reached the number N of valid video segments in the list; the process branches to step S705 if I<N, or to step S708 otherwise.
  • In step S705, the partial reproduction section information 112 (FIG. 12) is referred to in order to obtain the partial reproduction section information for summary of the video segment corresponding to the video segment ID currently being processed. If one video segment ID has a plurality of pieces of partial reproduction section information, all of them are obtained.
  • Subsequently, in step S706, the partial reproduction section information obtained in step S705 is appended to the partial reproduction section information for the summary. At this time, the newly obtained partial reproduction section information is added after the information added in the previous iteration, so as to keep the order of the list generated in step S702 (that is, the video segment order of the editing result information).
  • In the subsequent step S707, the loop counter I is incremented (I=I+1), and the flow returns to step S704. Thus, the process of steps S705 and S706 is repeated while I<N, generating the partial reproduction section information 114 for the summary.
  • Thus, as shown in the example in FIG. 14, the partial reproduction section information 114 for the summary retains the arrangement of the video segments represented in the record 1301 of the editing result information in FIG. 13.
  • If I≥N is confirmed in step S704, the process proceeds to step S708 to complete the partial reproduction section information 114 for the summary and store it in the external storage device 107. In step S709, the partial moving images are sequentially reproduced by referring to the partial reproduction section information 114 stored in step S708, thereby reproducing the summary.
  • According to the above process, the corresponding partial moving images are reproduced according to the reproduction order of the edited video segments, so that the summary reproduction automatically corresponds to the editing results.
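  • The loop of steps S701 through S708 can be sketched as follows. This is a minimal illustration, assuming that the editing result information (record 1301) is an ordered list of (segment ID, deleted-flag) pairs and that the partial reproduction section information 112 maps each segment ID to its list of reproduction sections; these names and data layouts are illustrative assumptions, not the actual storage format.

```python
def build_summary_playlist(editing_result, partial_sections):
    """Build the summary playlist (steps S701-S708), skipping deleted segments.

    editing_result: ordered list of (segment_id, deleted) pairs, as in record 1301.
    partial_sections: dict mapping segment_id -> list of (start, end) reproduction
    sections (the partial reproduction section information 112). Illustrative only.
    """
    # S702: keep only valid (non-deleted) segments, preserving the edited order.
    valid_ids = [seg_id for seg_id, deleted in editing_result if not deleted]
    n = len(valid_ids)

    summary = []          # partial reproduction section information 114
    i = 0                 # S703: loop counter
    while i < n:          # S704
        seg_id = valid_ids[i]
        # S705/S706: append every partial section of this segment, in list order.
        summary.extend(partial_sections.get(seg_id, []))
        i += 1            # S707
    return summary        # S708: ready to be stored and reproduced (S709)

# Example mirroring FIG. 13: segments 2 and 5 were deleted by editing.
editing_result = [(3, False), (1, False), (2, True), (4, False), (5, True)]
partial_sections = {1: [(0, 5)], 2: [(10, 12)], 3: [(20, 24)],
                    4: [(30, 33), (40, 42)], 5: [(50, 51)]}
print(build_summary_playlist(editing_result, partial_sections))
# → [(20, 24), (0, 5), (30, 33), (40, 42)]
```

The deleted segments (IDs 2 and 5) contribute no sections, and the remaining sections follow the edited segment order, as in FIG. 14.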
  • Returning to FIG. 2, if a search of the moving image is instructed through the operation portion 105, the process proceeds from step S210 to step S211 to perform a search by using the meta-data. Even when searching the video segments by using the meta-data, the editing result information is checked to determine whether a video segment meeting the search condition has become a video segment deleted by the editing, so as not to show the deleted video segment in the search hit results.
  • FIG. 8 is a flowchart describing a search process in step S211. In this process, the video segments deleted by the editing are excluded from the search results.
  • First, in step S801, a meta-data attribute to be searched and a search query are specified. In this embodiment, as shown in FIG. 11, the meta-data includes a plurality of attributes (“object,” “annotation” and so on), each with its value recorded. In step S801, the search query is specified, together with the attribute of the meta-data to be examined for data matching the search query.
  • In step S802, the meta-data of the attribute specified in step S801 is examined, and a list of the video segment IDs matching the search query is generated. In step S803, the post-editing video segment arrangement (record 1301) of the editing result information is referred to, and the video segment IDs marked with “*” (IDs of the video segments deleted by the editing) are excluded from the list generated in step S802. In the example in FIG. 13, IDs 2 and 5 are deleted, and so they are removed if they exist in the list generated in step S802.
  • For instance, in the case where “object” is selected as the meta-data attribute and a video segment showing a hand is searched for, the search hits video segment IDs 1 and 2 according to the meta-data information in FIG. 11. However, video segment ID 2 has been deleted according to the editing result information 113, so that only video segment ID 1 is returned as the search result.
  • The above process can prevent the data deleted by the editing from being included in the search results.
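  • The exclusion of deleted segments from the meta-data search (steps S801 through S803) can be sketched as follows, again under illustrative assumptions about how the meta-data and the editing result information are laid out.

```python
def search_segments(metadata, attribute, query, editing_result):
    """Search video segments by meta-data (S802) and drop deleted ones (S803).

    metadata: dict mapping segment_id -> dict of attribute -> list of values.
    editing_result: ordered list of (segment_id, deleted) pairs (record 1301).
    Both layouts are illustrative assumptions.
    """
    # S802: collect IDs whose specified attribute matches the query.
    hits = [seg_id for seg_id, attrs in metadata.items()
            if query in attrs.get(attribute, [])]
    # S803: drop IDs marked "*" (deleted by editing) in the editing result.
    deleted = {seg_id for seg_id, was_deleted in editing_result if was_deleted}
    return [seg_id for seg_id in hits if seg_id not in deleted]

# Example mirroring FIGS. 11 and 13: "hand" appears in segments 1 and 2,
# but segment 2 was deleted by editing, so only segment 1 is returned.
metadata = {1: {"object": ["hand"]}, 2: {"object": ["hand"]}, 3: {"object": ["car"]}}
editing_result = [(3, False), (1, False), (2, True)]
print(search_segments(metadata, "object", "hand", editing_result))
# → [1]
```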
  • In addition, besides excluding the video segments deleted as a result of the editing from the search results, it is also feasible to reflect the arrangement of the video segment IDs obtained as the editing results when presenting the search results (for instance, presenting the search results in the order of the video segment ID arrangement).
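  • Presenting search hits in the order of the edited video segment arrangement can be sketched as follows; the (segment ID, deleted-flag) pair layout of the editing result information is an illustrative assumption.

```python
def order_by_editing(hits, editing_result):
    """Sort search hits according to the post-editing segment arrangement.

    editing_result: ordered list of (segment_id, deleted) pairs (record 1301),
    an illustrative layout. IDs absent from the arrangement sort last.
    """
    order = [seg_id for seg_id, deleted in editing_result if not deleted]
    rank = {seg_id: pos for pos, seg_id in enumerate(order)}
    return sorted(hits, key=lambda seg_id: rank.get(seg_id, len(order)))

# Edited arrangement is 3, 1, 4 (segment 2 deleted), so hits are re-ordered.
editing_result = [(3, False), (1, False), (2, True), (4, False)]
print(order_by_editing([1, 3, 4], editing_result))
# → [3, 1, 4]
```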
  • In addition, while the above embodiment was described with a conceptual diagram storing the video segment section information 110, meta-data information 111, partial reproduction section information 112 and editing result information 113 in fixed fields, the present invention is not limited thereto. For instance, since an attribute can be represented by an identifier (tag) when using a structure description language such as XML, HTML or SGML, the size and description positions are no longer limited. Moreover, the series of information 110, 111, 112 and 113 can also be represented by using another structure description language.
  • In addition, while the unit of editing is the video segment in this embodiment, it may also be a shot or a unit based on understanding of the contents.
  • In addition, while the search according to this embodiment was performed over the entire field to be searched when searching the video segments by using the meta-data, it is also feasible to prepare an index in advance for an efficient search, so that the applicable video segment IDs can be obtained efficiently by referring to it. For instance, assume there is a meta-data field describing the persons shown in a video. In the case where three persons A, B and C appear in the video, the index consists of the lists of IDs of the video segments in which A, B and C appear, respectively. When there is an instruction to search for the video segments in which A appears, the search is very fast since the list of IDs of the video segments in which A appears can be used as-is as the search results. On the other hand, when there is no index, the search takes processing time since the entire field describing the persons appearing in the video must be scanned.
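  • Such an index, commonly known as an inverted index, can be prepared as follows; the per-segment person lists and field names are illustrative assumptions.

```python
from collections import defaultdict

def build_person_index(metadata):
    """Build an inverted index from a person's name to the IDs of the video
    segments in which that person appears. Layout is illustrative only."""
    index = defaultdict(list)
    for seg_id, persons in metadata.items():
        for person in persons:
            index[person].append(seg_id)
    return index

# Persons A, B and C appearing across four video segments.
metadata = {1: ["A", "B"], 2: ["C"], 3: ["A"], 4: ["B", "C"]}
index = build_person_index(metadata)
# A lookup now returns the hit list directly, without scanning every field.
print(index["A"])
# → [1, 3]
```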
  • Moreover, it is needless to mention that the object of the present invention is also attained by supplying, to a system or an apparatus, a storage medium on which a program code of software implementing the functions of the aforementioned embodiment is recorded, and having the program code stored in the storage medium read and executed by a computer (or a CPU or an MPU) of the system or apparatus.
  • In this case, the program code read from the storage medium itself implements the functions of the aforementioned embodiment, and so the storage medium storing the program code constitutes the present invention.
  • As for the storage medium for supplying the program code, a floppy disk, a hard disk, an optical disk, a magneto-optical disk, a CD-ROM, a CD-R, a magnetic tape, a nonvolatile memory card, a ROM and so on may be used.
  • In addition, it is needless to mention that the present invention covers not only the case where execution of the program code read by the computer implements the functions of the aforementioned embodiment, but also the case where an OS (operating system) or the like running on the computer performs a part or all of the actual processing based on instructions of the program code, and the functions of the aforementioned embodiment are thereby implemented.
  • Furthermore, it is needless to mention that the present invention covers the case where the program code read from the storage medium is written to a memory provided on a feature expansion board inserted into the computer or a feature expansion unit connected to the computer, and thereafter a CPU or the like provided on the feature expansion board or feature expansion unit performs a part or all of the actual processing based on instructions of the program code, and the functions of the aforementioned embodiment are thereby implemented.
  • As described above, according to the present invention, the meta-data search results automatically follow the virtual editing of the moving image, alleviating the editor's burden of reediting the meta-data.
  • In addition, according to the present invention, the summary result is automatically adapted following the virtual editing of the moving image, allowing summary reproduction that does not give a sense of incongruity even after the moving image has been edited.
  • As many apparently widely different embodiments of the present invention can be made without departing from the spirit and scope thereof, it is to be understood that the invention is not limited to the specific embodiments thereof except as defined in the claims.

Claims (21)

1-16. (canceled)
17. A moving image data processing apparatus, comprising:
a segment managing unit adapted to manage moving image data that have been divided into segments comprised of a plurality of frames and to manage each segment to which an ID is assigned;
a frame group portion management unit adapted to manage a frame group portion including a plurality of continuous frames extracted from a segment belonging to the plurality of segments, by associating the frame group portion with the ID corresponding to the segment of an extraction source thereof;
an editing result storing unit adapted to store an arrangement of the IDs obtained as editing results of the moving image data; and
a summary reproduction unit adapted to sequentially reproduce the frame group portion managed by said frame group portion management unit based on contents stored in said editing result storing unit.
18. The apparatus according to claim 17, wherein said summary reproduction unit sequentially reproduces frame group portions managed by said frame group portion management unit according to arrangement of the IDs stored in said editing result storing unit.
19. The apparatus according to claim 17, wherein said editing result storing unit further stores deletion information indicating an ID of a segment deleted as a result of editing, and
wherein said summary reproduction unit does not reproduce a frame group portion associated with an ID indicated by the deletion information.
20. A moving image data processing apparatus, comprising:
a segment managing unit adapted to manage moving image data that have been divided into segments comprised of a plurality of frames and to manage each segment to which an ID is assigned; and
an editing result storing unit adapted to store an arrangement of the IDs obtained as editing results of the moving image data,
wherein said editing result storing unit further stores deletion information for showing the ID of a segment deleted as a result of editing.
21. A moving image data processing apparatus, comprising:
a segment managing unit adapted to manage moving image data that have been divided into segments comprised of a plurality of frames and to manage each segment to which an ID is assigned;
an editing result storing unit adapted to store an arrangement of the IDs obtained as editing results of the moving image data;
a meta-data managing unit adapted to manage meta-data corresponding to each of the plurality of segments by associating it with the IDs corresponding to each of the plurality of segments;
a search unit adapted to search for an image by using the meta-data; and
a reflecting unit adapted to reflect the contents stored by said editing result storing unit on search results of said search unit.
22. The apparatus according to claim 21, wherein said reflecting unit presents the search results according to the arrangement of the IDs after the editing stored by said editing result storing unit.
23. The apparatus according to claim 21, wherein the editing results stored by said editing result storing unit include deletion information for showing the ID deleted by editing, and
wherein said reflecting unit deletes search results corresponding to the ID shown in the deletion information from the search results.
24. A moving image data processing method, comprising:
a segment managing step of managing moving image data that have been divided into segments comprised of a plurality of frames and managing each segment to which an ID is assigned;
a frame group portion management step of managing a frame group portion including a plurality of continuous frames extracted from a segment belonging to the plurality of segments, by associating the frame group portion with the ID corresponding to the segment of an extraction source thereof;
an editing result storing step of storing, in a storage unit, an arrangement of the IDs obtained as editing results of the moving image data; and
a summary reproduction step of sequentially reproducing the frame group portion managed in said frame group portion management step based on contents stored in the storage unit.
25. The method according to claim 24, wherein said summary reproduction step sequentially reproduces frame group portions managed by said frame group portion management step according to arrangement of the IDs stored in the storage unit.
26. The method according to claim 24, wherein said editing result storing step further stores, in the storage unit, deletion information indicating an ID of a segment deleted as a result of editing, and
wherein a frame group portion associated with an ID indicated by the deletion information is not reproduced in said summary reproduction step.
27. A moving image data processing method, comprising:
a segment managing step of managing moving image data that have been divided into segments comprised of a plurality of frames and managing each segment to which an ID is assigned; and
an editing result storing step of storing, in a storage unit, an arrangement of the IDs obtained as editing results of the moving image data,
wherein said editing result storing step further stores, in the storage unit, deletion information for showing the ID of a segment deleted as a result of editing.
28. A moving image data processing method, comprising:
a segment managing step of managing moving image data that have been divided into segments comprised of a plurality of frames and managing each segment to which an ID is assigned;
an editing result storing step of storing, in a storage unit, an arrangement of the IDs obtained as editing results of the moving image data;
a meta-data managing step of managing meta-data corresponding to each of the plurality of segments by associating it with the IDs corresponding to each of the plurality of segments;
a search step of searching for an image by using the meta-data; and
a reflecting step of reflecting the contents stored in the storage unit in said editing result storing step on search results of said search step.
29. The method according to claim 28, wherein said reflecting step presents the search results according to the arrangement of the IDs after the editing stored in the storage unit by said editing result storing step.
30. The method according to claim 28, wherein the editing results stored in the storage unit by said editing result storing step include deletion information for showing the ID deleted by editing, and
wherein said reflecting step deletes search results corresponding to the ID shown in the deletion information from the search results.
31. A storage medium for storing a control program for causing a computer to execute a moving image data processing method according to claim 24.
32. A storage medium for storing a control program for causing a computer to execute a moving image data processing method according to claim 27.
33. A storage medium for storing a control program for causing a computer to execute a moving image data processing method according to claim 28.
34. A moving image data processing apparatus, comprising:
segment managing means for managing moving image data that have been divided into segments comprised of a plurality of frames and for managing each segment to which an ID is assigned;
frame group portion management means for managing a frame group portion including a plurality of continuous frames extracted from a segment belonging to the plurality of segments, by associating the frame group portion with the ID corresponding to the segment of an extraction source thereof;
editing result storing means for storing an arrangement of the IDs obtained as editing results of the moving image data; and
summary reproduction means for sequentially reproducing the frame group portion managed by said frame group portion management means based on contents stored in said editing result storing means.
35. A moving image data processing apparatus, comprising:
segment managing means for managing moving image data that have been divided into segments comprised of a plurality of frames and for managing each segment to which an ID is assigned; and
editing result storing means for storing an arrangement of the IDs obtained as editing results of the moving image data,
wherein said editing result storing means further stores deletion information for showing the ID of a segment deleted as a result of editing.
36. A moving image data processing apparatus, comprising:
segment managing means for managing moving image data that have been divided into segments comprised of a plurality of frames and for managing each segment to which an ID is assigned;
editing result storing means for storing an arrangement of the IDs obtained as editing results of the moving image data;
meta-data managing means for managing meta-data corresponding to each of the plurality of segments by associating it with the IDs corresponding to each of the plurality of segments;
search means for searching for an image by using the meta-data; and
reflecting means for reflecting the contents stored by said editing result storing means on search results of said search means.
US11/751,107 2001-09-18 2007-05-21 Moving image data processing apparatus and method Active 2026-10-06 US8644683B2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2001283960A JP3943880B2 (en) 2001-09-18 2001-09-18 Moving picture data processing apparatus and method
JP2001-283960 2001-09-18
US10/242,618 US7257311B2 (en) 2001-09-18 2002-09-13 Moving image data processing apparatus and method
US11/751,107 US8644683B2 (en) 2001-09-18 2007-05-21 Moving image data processing apparatus and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11/751,107 US8644683B2 (en) 2001-09-18 2007-05-21 Moving image data processing apparatus and method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/242,618 Division US7257311B2 (en) 2001-09-18 2002-09-13 Moving image data processing apparatus and method

Publications (2)

Publication Number Publication Date
US20070230807A1 true US20070230807A1 (en) 2007-10-04
US8644683B2 US8644683B2 (en) 2014-02-04

Family

ID=19107368

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/242,618 Expired - Fee Related US7257311B2 (en) 2001-09-18 2002-09-13 Moving image data processing apparatus and method
US11/751,107 Active 2026-10-06 US8644683B2 (en) 2001-09-18 2007-05-21 Moving image data processing apparatus and method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/242,618 Expired - Fee Related US7257311B2 (en) 2001-09-18 2002-09-13 Moving image data processing apparatus and method

Country Status (2)

Country Link
US (2) US7257311B2 (en)
JP (1) JP3943880B2 (en)


Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6752505B2 (en) * 1999-02-23 2004-06-22 Solid State Opto Limited Light redirecting films and film systems
JP3943880B2 (en) * 2001-09-18 2007-07-11 キヤノン株式会社 Moving picture data processing apparatus and method
JP4335492B2 (en) * 2002-03-05 2009-09-30 キヤノン株式会社 Moving image management method and apparatus
US7286749B2 (en) * 2002-04-16 2007-10-23 Canon Kabushiki Kaisha Moving image playback apparatus, moving image playback method, and computer program thereof with determining of first voice period which represents a human utterance period and second voice period other than the first voice period
JP3677779B2 (en) * 2003-04-04 2005-08-03 ソニー株式会社 The information processing apparatus and method, program, and recording medium
JP2005004866A (en) * 2003-06-11 2005-01-06 Matsushita Electric Ind Co Ltd Device and method for processing information, recording medium, and program
EP1494238B1 (en) * 2003-07-01 2009-12-23 Thomson Licensing Method and apparatus for editing a data stream
EP1494237A1 (en) * 2003-07-01 2005-01-05 Deutsche Thomson-Brandt Gmbh Method and apparatus for editing a data stream
JP4412159B2 (en) * 2004-01-29 2010-02-10 セイコーエプソン株式会社 Image processing apparatus, control method of the printer and printer
US8667401B1 (en) * 2004-05-26 2014-03-04 Adobe Systems Incorporated System and method for archiving collaborative electronic meetings
EP1784011A4 (en) * 2004-08-10 2011-09-28 Sony Corp Information signal processing method, information signal processing device, and computer program recording medium
JP2006099671A (en) * 2004-09-30 2006-04-13 Toshiba Corp Search table of meta data of moving image
JP4251131B2 (en) * 2004-11-17 2009-04-08 ソニー株式会社 Data processing apparatus and method
KR100739770B1 (en) * 2004-12-11 2007-07-13 삼성전자주식회사 Storage medium including meta data capable of applying to multi-angle title and apparatus and method thereof
FR2891081B1 (en) * 2005-09-21 2009-05-29 Little Worlds Studio Soc Respo Method and device elaboration of a DVD-video; program, recording medium and instantiating module for this process
KR101271393B1 (en) 2005-11-07 2013-06-05 코닌클리케 필립스 일렉트로닉스 엔.브이. Method and apparatus for editing a program on an optical disc
US20070204238A1 (en) * 2006-02-27 2007-08-30 Microsoft Corporation Smart Video Presentation
JP2007281680A (en) * 2006-04-04 2007-10-25 Sony Corp Image processor and image display method
US20080065693A1 (en) * 2006-09-11 2008-03-13 Bellsouth Intellectual Property Corporation Presenting and linking segments of tagged media files in a media services network
JP4918836B2 (en) * 2006-09-29 2012-04-18 富士ゼロックス株式会社 Dynamic information processing apparatus and information processing program
KR101445074B1 (en) * 2007-10-24 2014-09-29 삼성전자주식회사 Method and apparatus for manipulating media object in media player
WO2009062252A1 (en) * 2007-11-15 2009-05-22 Netcat.Biz Pty Limited System and method for transforming documents for publishing electronically
KR101700811B1 (en) * 2010-09-02 2017-02-01 주식회사 케이티 Method and server for providing contents continuous play service based on locations of user's mobile devices
CA2878735C (en) * 2012-07-10 2018-08-14 Sharp Kabushiki Kaisha Playback device, playback method, distribution device, distribution method, distribution program, playback program, recording medium, and metadata

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6192183B1 (en) * 1996-05-30 2001-02-20 Nippon Telegraph And Telephone Corporation Video editing scheme using icons directly obtained from coded video data
US6192783B1 (en) * 1998-08-28 2001-02-27 Jidosha Kiki Co., Ltd. Brake booster
US6289166B1 (en) * 1998-01-21 2001-09-11 Kabushiki Kaisha Toshiba Video data recording medium, video data recording apparatus and video data playback apparatus
US20020003881A1 (en) * 1998-08-20 2002-01-10 Glenn Arthur Reitmeier Secure information distribution system utilizing information segment scrambling
US6370316B1 (en) * 1999-07-13 2002-04-09 Matsushita Electric Industrial Co., Ltd. Apparatus for retrieving and administrating moving pictures and related network system
US6462754B1 (en) * 1999-02-22 2002-10-08 Siemens Corporate Research, Inc. Method and apparatus for authoring and linking video documents
US20060253780A1 (en) * 1998-12-25 2006-11-09 Mastushita Electric Industrial Co., Ltd. Data processing device and method for selecting media segments on the basis of a score
US7257311B2 (en) * 2001-09-18 2007-08-14 Canon Kabushiki Kaisha Moving image data processing apparatus and method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11238071A (en) 1998-02-20 1999-08-31 Toshiba Corp Device and method for digest generation


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9445016B2 (en) 2004-10-29 2016-09-13 Microsoft Technology Licensing, Llc Features such as titles, transitions, and/or effects which vary according to positions
US20060271855A1 (en) * 2005-05-27 2006-11-30 Microsoft Corporation Operating system shell management of video files
US7945142B2 (en) 2006-06-15 2011-05-17 Microsoft Corporation Audio/visual editing tool
US20070292106A1 (en) * 2006-06-15 2007-12-20 Microsoft Corporation Audio/visual editing tool
US20080120550A1 (en) * 2006-11-17 2008-05-22 Microsoft Corporation Example based video editing
US8375302B2 (en) 2006-11-17 2013-02-12 Microsoft Corporation Example based video editing
US9880693B2 (en) 2006-11-17 2018-01-30 Microsoft Technology Licensing, Llc Example based video editing
US8650489B1 (en) * 2007-04-20 2014-02-11 Adobe Systems Incorporated Event processing in a content editor
US8639086B2 (en) 2009-01-06 2014-01-28 Adobe Systems Incorporated Rendering of video based on overlaying of bitmapped images

Also Published As

Publication number Publication date
JP2003092723A (en) 2003-03-28
JP3943880B2 (en) 2007-07-11
US8644683B2 (en) 2014-02-04
US20030052910A1 (en) 2003-03-20
US7257311B2 (en) 2007-08-14


Legal Events

Date Code Title Description
STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4