WO2010073695A1 - Editing information presentation device, editing information presentation method, program, and recording medium - Google Patents
Editing information presentation device, editing information presentation method, program, and recording medium
- Publication number
- WO2010073695A1 (PCT/JP2009/007241)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- editing
- section
- information
- content
- edit
- Prior art date
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/034—Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/34—Indicating arrangements
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/60—Solid state media
- G11B2220/61—Solid state media wherein solid state memory is used for storing A/V content
Definitions
- The present invention relates to an editing information presentation apparatus, an editing information presentation method, an editing information presentation program, and a recording medium, and in particular to an editing information presentation apparatus capable of presenting the correspondence relationship between content before editing and content after editing.
- The video editing software Adobe Premiere Pro CS3 is existing software that can present content editing history information (for example, Non-Patent Document 1).
- This software is used for editing video content and has an editing history management function called "history" that holds the editing actions performed by the user during video editing in chronological order.
- Patent Document 1 describes a video editing support device.
- This video editing support device presents candidates for how to arrange shots during video editing to general users who have no video editing skill or knowledge.
- The device learns the time-series structure information, synchronization information, and identical scenes of the video and sound of the pre-edit and post-edit video, compares the characteristics of the material video to be edited against the learning results, and presents editing candidates.
- Learning is performed using the post-edit video produced by the editing process and the pre-edit video that is the source before editing.
- FIG. 19 is a diagram showing only the learning means part of the video editing support apparatus of Patent Document 1.
- the learning unit 1000 includes a video reception unit 1001, a video signal extraction unit 1002, and a video analysis unit 1003.
- The pre-edit video and post-edit video used for learning are input to the video reception unit 1001.
- The video signal extraction unit 1002 extracts video and sound features from the pre-edit video and the post-edit video received by the video reception unit 1001.
- The video analysis unit 1003 determines which shot of the pre-edit video is used at which position of the post-edit video by comparing the video and sound features of the two videos.
- In Patent Document 2, the relationship between a material called a parent clip and a child segment created from that material, that is, the correspondence between the parent clip and the child segment, is indicated by an arrow or another correspondence expression such as color.
- The second problem is that the user cannot easily grasp an overview of the changes to the entire content before and after editing.
- The reason is that showing such an overview requires computing, from the editing history information, editing information that indicates the overall structural change of the content before and after editing.
- An editing history management function alone cannot extract an overview of the changes to the entire content.
- Although the method of Patent Document 1 has a function of automatically generating content according to rules obtained from learned content, it does not store the editing actions performed by the user as editing history information or explicitly display the changes to the content; therefore it cannot extract an overview of the changes to the entire content.
- Patent Document 2 does not disclose how clips (sections) move within the content before and after editing.
- Nor could it convey the intention of the editing more clearly by grouping corresponding parts and displaying the correspondence information between the pre-edit content and the post-edit content.
- The present invention has been made in view of the above problems, and aims to provide an editing information presentation device, an editing information presentation method, a program, and a recording medium capable of presenting the user's editing intention and allowing an overview of the edited content to be grasped.
- The present invention provides an editing information presentation device that presents a correspondence relationship before and after editing for an editing process in which content having a time axis is modified section by section, comprising:
- editing section information acquisition means for associating, with each other, a plurality of mutually corresponding sections between a portion of the pre-edit content composed of a plurality of modified sections sandwiched between unmodified sections and a portion of the post-edit content composed of a plurality of modified sections sandwiched between unmodified sections; and
- presentation means for presenting the association of the plurality of sections as a single correspondence relationship.
- The editing information presentation device refers to the relationship between the sections of the pre-edit content and the post-edit content, and extracts and displays editing information that summarizes two types of section information, namely edit sections and unedited sections, group information in which consecutive edit sections are grouped, and correspondence information between sections, between groups, and between sections and groups.
- The editing information presentation device can display an overview of the edited content using the group information and the inter-group correspondence information, display the details of the edited content using the section information and the inter-section correspondence information, or switch between the overview and the details in response to input to the device. Displaying the inter-group correspondence information is important for grasping an overview of the edited content.
- The inter-group correspondence information summarizes the correspondence relationships of edit sections between the content before and after editing. Therefore, if the edit sections of the pre-edit and post-edit content each belong to the same group in their respective content, knowing the inter-group correspondence tells the user that edit sections exist within the associated groups and that some kind of editing has been performed, which is an overview of the edited content. In other words, by replacing the correspondence information between individual edit sections of the pre-edit content and the post-edit content with correspondence information between groups, the correspondence relationship between the content before and after editing can be expressed concisely. This makes it possible to grasp an overview of the edited content.
- the content used here refers to content having time information such as moving image data and sound data, and does not include content having no time information such as text and still images.
- The present invention also provides an editing information presentation method in an editing information presentation device that presents a correspondence relationship before and after editing for an editing process in which content having a time axis is modified section by section, comprising:
- an editing section information acquisition step of associating, with each other, a plurality of mutually corresponding sections between a portion of the pre-edit content composed of a plurality of modified sections sandwiched between unmodified sections and a portion of the post-edit content composed of a plurality of modified sections sandwiched between unmodified sections; and
- a presentation step of presenting the association of the plurality of sections as a single correspondence relationship.
- The present invention also provides a program that causes a computer to function as the above-described editing information presentation device, and a recording medium on which this program is recorded.
- FIG. 3 is a diagram showing an example of inter-edit-group association information obtained by aggregating the edit section associations of FIG. 2 in units of groups.
- A diagram showing the relationship obtained by associating the group IDs of the pre-edit content with the section IDs of the post-edit content.
- Screen information in which the information of each section is expressed as a component and an edit group component is associated with an edit section component.
- A diagram showing an example of screen information indicating that the consecutive edit section components of FIG. 6 can be aggregated into an edit group component, and that screen information using both edit section components and edit group components can be produced.
- FIG. 10 is a diagram illustrating an example of a flow of extracting shot / sound signal information and shot / sound synchronization information from a pre-edit video and a post-edit video of Patent Document 1.
- FIG. 1 is a diagram illustrating an example of a schematic configuration of the editing information presentation device 1 according to the first embodiment.
- An editing information presentation device 1 (an example of an editing information presentation device) includes an editing section information acquisition unit 2, a pre-edit content/post-edit content correspondence acquisition unit 3, an editing information display screen generation unit 4, an input unit 5, and a presentation unit 6.
- The editing section information acquisition unit 2 calculates editing section information based on the editing history information and, in the process of calculating the editing section information, associates the edit sections of the pre-edit content with the edit sections of the post-edit content that were produced by the editing, and outputs the result as edit section association information.
- The edit section association information associates edit sections as pairs using the content ID and section ID of the editing section information that identify each edit section; that is, it is created by associating a combination of the ID of the pre-edit content and the ID identifying its edit section with a combination of the ID of the post-edit content and the ID identifying its edit section.
- the editing history information is information that records an editing operation when a certain video is edited.
- An edit section is a section modified by editing in the course of repeated editing operations, and the term refers both to the section of the pre-edit content to which an editing operation was applied and to the resulting section of the post-edit content.
- the editing section information is information for specifying a section in which editing is performed between the pre-editing content and the post-editing content, and information describing the added editing processing content.
- The pre-edit content/post-edit content correspondence acquisition unit 3 groups consecutive edit sections based on the editing section information calculated by the editing section information acquisition unit 2 and generates group information for identifying each group.
- group information is information in which, for example, an ID that identifies each of a plurality of sections belonging to the same group and an ID that identifies the group are associated with each other.
- Based on the edit section association information output from the editing section information acquisition unit 2, the pre-edit content/post-edit content correspondence acquisition unit 3 also performs association between the groups made up of edit sections, and association between sections and groups.
- The section/group association information associates, on the premise that each group ID is tied to a content ID, an ID identifying an edit section in the pre-edit content with an ID identifying a group in the post-edit content, or an ID identifying a group in the pre-edit content with an ID identifying an edit section in the post-edit content.
- The inter-group association information is, for example, information in which groups sharing a corresponding edit section are associated with each other by group ID between the pre-edit content and the post-edit content.
- the input unit 5 sends the input content from the user to the edit information display screen generation unit 4.
- The editing information display screen generation unit 4 generates screen information based on six pieces of information: the editing section information and edit section association information from the editing section information acquisition unit 2; the group information, inter-group association information, and section/group association information from the pre-edit content/post-edit content correspondence acquisition unit 3; and the input information from the input unit 5.
- the presentation unit 6 displays the screen information obtained from the editing information display screen generation unit 4.
- the editing section information acquisition unit 2 acquires editing section information and editing section association information from the editing history information output from the editing history information storage unit 100.
- the editing history information is usually recorded and stored for each section where the editing operation is performed.
- The stored information includes information indicating the content of the editing operation and information identifying the sections of the content before and after editing to which the operation was applied (start time (start frame), end time (end frame), section length (number of frames), and so on).
- For example, if color correction was applied to a two-second section, the editing history information stores information indicating that color correction was performed as the editing operation, that the operation starts at the first second and ends at the third second in the pre-edit content, that it starts at the first second and ends at the third second in the post-edit content, and that the length of the edited section is two seconds.
- the editing section information includes an ID for identifying the content in which the editing section exists, an ID for identifying the editing section within the content, editing processing content information, a section start / end time, and a section length.
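- As a concrete illustration of the record layout described above, the following Python sketch models editing section information and edit section association information as simple data classes. The class and field names are assumptions chosen for the example; the patent does not prescribe any particular encoding.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class EditSection:
    """One edit section (hypothetical field names mirroring the description above)."""
    content_id: str    # ID of the content in which the edit section exists
    section_id: str    # ID of the edit section within that content
    operation: str     # editing processing content, e.g. "color_correction"
    start: float       # section start time in seconds
    end: float         # section end time in seconds

    @property
    def length(self) -> float:
        return self.end - self.start

@dataclass
class SectionAssociation:
    """One entry of the edit section association information: a pair of
    (content_id, section_id) keys, one for the pre-edit content and one
    for the post-edit content."""
    before: Tuple[str, str]
    after: Tuple[str, str]

src = EditSection("pre", "A1", "color_correction", 1.0, 3.0)
dst = EditSection("post", "A2", "color_correction", 1.0, 3.0)
link = SectionAssociation(before=(src.content_id, src.section_id),
                          after=(dst.content_id, dst.section_id))
print(link, src.length)
# SectionAssociation(before=('pre', 'A1'), after=('post', 'A2')) 2.0
```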
- While the editing history information describes the changes to the content for each editing operation, the editing section information describes only the final change obtained by comparing the pre-edit content with the post-edit content.
- For example, if section A was moved and then deleted, the editing history information of section A includes both the information on the movement of section A and the information on its deletion as editing operation contents.
- In contrast, the editing section information of section A describes only the information related to the deletion of section A.
- the editing section information acquisition unit 2 calculates editing section information from the editing history information.
- the correspondence between the editing section of the pre-editing content and the editing section of the post-editing content is calculated as editing section association information.
- the editing section association information is calculated using, for example, a position change of the section indicated by the editing history information, a section length change, and the like.
- For example, suppose a first post-edit content is created by applying a first editing operation to the pre-edit content, and a second post-edit content is then created by applying a second editing operation to the first post-edit content.
- The editing history information indicates the association between the pre-edit content and the first post-edit content, and between the first post-edit content and the second post-edit content.
- Using this editing history information, the edit section association information can indicate the association between the pre-edit content and the second post-edit content. That is, the edit section association information makes it possible to associate sections between the pre-edit content and content that has been edited multiple times.
- the editing section association information is created by associating editing sections as a pair using a content ID and section ID of the editing section information specifying the editing section.
- Some edit sections, such as deleted or added sections, cannot be paired. For example, a section that existed before editing disappears when it is deleted, so there is no edit section in the post-edit content to associate it with; conversely, an added section has no counterpart in the pre-edit content. Such deleted or added edit sections are associated with an ID, prepared in advance, that indicates the absence of a corresponding section.
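- A minimal sketch of how the "no corresponding section" ID might be applied is shown below, assuming plain section IDs and a hypothetical sentinel value; the actual ID format is not specified in the patent.

```python
# Sentinel ID prepared in advance to mark "no corresponding section"
# (the name is an assumption; the patent only says such an ID exists).
NO_SECTION = "__NO_SECTION__"

def associate_sections(before_to_after: dict, before_ids: list, after_ids: list) -> list:
    """Build (before_id, after_id) pairs from a mapping of surviving sections.

    before_to_after maps a pre-edit section ID to its post-edit counterpart.
    Deleted sections get NO_SECTION on the post-edit side; added sections get
    NO_SECTION on the pre-edit side.
    """
    pairs = []
    for b in before_ids:
        pairs.append((b, before_to_after.get(b, NO_SECTION)))
    mapped_after = set(before_to_after.values())
    for a in after_ids:
        if a not in mapped_after:          # section exists only after editing -> added
            pairs.append((NO_SECTION, a))
    return pairs

# Example: section "A1" survives as "A2", section "B1" was deleted, "C2" was added.
print(associate_sections({"A1": "A2"}, ["A1", "B1"], ["A2", "C2"]))
# [('A1', 'A2'), ('B1', '__NO_SECTION__'), ('__NO_SECTION__', 'C2')]
```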
- The pre-edit content/post-edit content correspondence acquisition unit 3 groups consecutive edit sections and generates group information describing the edit sections each group contains. It then generates inter-group association information by associating the edit groups before and after editing, and section/group association information by associating edit sections with edit groups.
- Grouping edit sections means combining sections that are consecutive in time order within the pre-edit content or the post-edit content into one unit.
- The pre-edit content/post-edit content correspondence acquisition unit 3 thus aggregates the editing information by grouping edit sections.
- A set of grouped edit sections is called an edit group.
- An edit group is created by combining consecutive edit sections into one group based on the section start/end time information of the editing section information obtained from the editing section information acquisition unit 2. If an edit section is not adjacent to another edit section, it forms an edit group on its own.
- Each created edit group is assigned a group ID that is unique within the content, and each piece of editing section information is assigned the group ID of the edit group to which the section belongs.
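- The grouping step can be sketched as follows in Python. The dictionary layout, the generated group ID format, and the gap_tolerance parameter are assumptions for illustration; the patent only requires that temporally consecutive edit sections fall into one group and that an isolated edit section forms a group of its own.

```python
def group_consecutive_sections(sections, gap_tolerance=0.0):
    """Group edit sections that are consecutive in time into edit groups.

    `sections` is a list of dicts with 'section_id', 'start' and 'end' keys,
    all belonging to one piece of content. Returns a mapping from a generated
    group ID (unique within the content) to the list of member section IDs.
    An isolated section forms a group on its own.
    """
    ordered = sorted(sections, key=lambda s: s["start"])
    groups, current, group_no = {}, [], 0
    for sec in ordered:
        if current and sec["start"] - current[-1]["end"] > gap_tolerance:
            groups[f"G{group_no}"] = [s["section_id"] for s in current]
            group_no += 1
            current = []
        current.append(sec)
    if current:
        groups[f"G{group_no}"] = [s["section_id"] for s in current]
    return groups

sections = [
    {"section_id": "A1", "start": 0.0, "end": 2.0},
    {"section_id": "B1", "start": 2.0, "end": 5.0},   # contiguous with A1 -> same group
    {"section_id": "D1", "start": 9.0, "end": 11.0},  # isolated -> its own group
]
print(group_consecutive_sections(sections))
# {'G0': ['A1', 'B1'], 'G1': ['D1']}
```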
- Next, two kinds of association are performed: association between the edit groups of the content before and after editing, and association between edit sections and edit groups.
- The pre-edit content/post-edit content correspondence acquisition unit 3 performs these two kinds of association and generates the corresponding association information.
- the editing groups before and after editing are associated based on group information composed of editing sections and editing section association information obtained from the editing section information acquisition unit 2.
- the association result of the edit group is generated as the association information between the edit groups.
- the inter-edit group association information is information in which edit section association information is aggregated in units of groups.
- FIG. 2 is a diagram showing an example of editing section association information.
- FIG. 3 is a diagram showing the inter-edit-group association information obtained by aggregating the edit section association information of FIG. 2 in units of groups.
- the arrows in FIGS. 2 and 3 indicate the correspondence.
- the crosses in FIG. 2 indicate erased sections, and the crosses in FIG. 3 indicate that there are erased sections in the group.
- the + mark in FIG. 2 indicates an added section
- the + mark in FIG. 3 indicates that there is an added section in the group.
- The association between the edit sections before and after editing shown in FIG. 2 is converted into the association between the groups to which those sections belong, shown in FIG. 3.
- The section IDs and group IDs of the editing section information are used to convert the section-level association into a group-level association.
- First, the section IDs belonging to each group ID of the pre-edit content are obtained from the group information.
- Next, the section IDs of the post-edit content paired with those section IDs are obtained from the edit section association information.
- Then, the group ID to which each of the obtained post-edit section IDs belongs is obtained.
- In this way the edit groups can be associated with each other.
- For example, the sections belonging to group A1 of the pre-edit content are sections A1 and B1.
- The sections of the post-edit content corresponding to sections A1 and B1 of the pre-edit content are sections A2 and B2.
- Sections A2 and B2 belong to group A2 of the post-edit content.
- Thus the association from group A1 to group A2 is obtained.
- The result of the association is generated as inter-group association information.
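- The aggregation of section-level associations into group-level associations described in the steps above can be sketched as follows (function and variable names are assumptions; pairs involving a deleted or added section without a counterpart are simply skipped here):

```python
def aggregate_to_group_pairs(section_pairs, group_of_before, group_of_after):
    """Aggregate section-level association pairs into group-level pairs.

    section_pairs:    iterable of (before_section_id, after_section_id)
    group_of_before:  dict mapping a pre-edit section ID to its group ID
    group_of_after:   dict mapping a post-edit section ID to its group ID
    Repeated group associations are collapsed into a single correspondence.
    """
    group_pairs = set()
    for before_sec, after_sec in section_pairs:
        gb = group_of_before.get(before_sec)
        ga = group_of_after.get(after_sec)
        if gb is not None and ga is not None:
            group_pairs.add((gb, ga))
    return sorted(group_pairs)

# Sections A1 and B1 (group GA1) correspond to A2 and B2 (group GA2):
section_pairs = [("A1", "A2"), ("B1", "B2")]
print(aggregate_to_group_pairs(section_pairs,
                               {"A1": "GA1", "B1": "GA1"},
                               {"A2": "GA2", "B2": "GA2"}))
# [('GA1', 'GA2')]  -- the two section correspondences collapse into one
```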
- When the same pair of groups is produced from the correspondences of several edit sections, those correspondences are combined into a single group-level correspondence.
- the section A1 and the section B1 of the pre-edit content shown in FIG. 2 belong to the group A1 shown in FIG.
- Sections A2 and B2 belong to group A2 shown in FIG. 3. Section A1 corresponds to section A2, and section B1 corresponds to section B2.
- The associations between group A1 and group A2 are therefore combined into one correspondence relationship. That is, the information that section A1 corresponds to section A2 and the information that section B1 corresponds to section B2 are merged into a single piece of association information stating that groups A1 and A2 before and after editing correspond to each other.
- Next, sections and groups are associated between the pre-edit content and the post-edit content. That is, using the edit section association information and the group information describing the edit sections, edit groups and edit sections are associated with each other across the pre-edit content and the post-edit content.
- First, section ID and group ID information is acquired from the group information, and the paired section IDs are acquired from the edit section association information. By matching the section IDs obtained here with the section IDs in the group information, edit sections and groups can be associated between the pre-edit content and the post-edit content.
- For example, the group information shows that group A1 of the pre-edit content consists of sections A1 and B1. The edit section association information shows that section A1 is associated with section A2 and section B1 with section B2. As a result, sections A2 and B2 of the post-edit content are associated with group A1.
- Likewise, group A2 of the post-edit content consists of sections A2 and B2; section A2 is associated with section A1 and section B2 with section B1, so sections A1 and B1 of the pre-edit content are associated with group A2.
- In this way, the sections of the pre-edit content are associated with the groups of the post-edit content, and the groups of the pre-edit content with the sections of the post-edit content.
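- A sketch of the section/group association derived from the same inputs is shown below; the return format is an assumption, but the logic follows the example above: each member section of a group is looked up in the edit section association information and linked to its counterpart on the other side.

```python
def section_to_group_pairs(section_pairs, group_members_before, group_members_after):
    """Associate a section on one side with the group it corresponds to on the other.

    group_members_before / group_members_after map a group ID to the list of its
    member section IDs. Returns two lists of pairs:
    (pre-edit group ID, post-edit section ID) and (pre-edit section ID, post-edit group ID).
    """
    after_of = dict(section_pairs)                   # before section -> after section
    before_of = {a: b for b, a in section_pairs}     # after section  -> before section

    before_group_to_after_sections = []
    for gid, members in group_members_before.items():
        for sec in members:
            if sec in after_of:
                before_group_to_after_sections.append((gid, after_of[sec]))

    after_group_to_before_sections = []
    for gid, members in group_members_after.items():
        for sec in members:
            if sec in before_of:
                after_group_to_before_sections.append((before_of[sec], gid))

    return before_group_to_after_sections, after_group_to_before_sections

pairs = [("A1", "A2"), ("B1", "B2")]
print(section_to_group_pairs(pairs, {"GA1": ["A1", "B1"]}, {"GA2": ["A2", "B2"]}))
# ([('GA1', 'A2'), ('GA1', 'B2')], [('A1', 'GA2'), ('B1', 'GA2')])
```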
- FIG. 4 is a diagram showing, for the section associations of FIG. 2, the result of associating the section IDs of the pre-edit content with the group IDs of the post-edit content and the group IDs of the pre-edit content with the section IDs of the post-edit content.
- The editing information display screen generation unit 4 generates screen information based on six pieces of information: the editing section information and edit section association information output from the editing section information acquisition unit 2; the group information, inter-group association information, and section/group association information output from the pre-edit content/post-edit content correspondence acquisition unit 3; and the input information output from the input unit 5.
- The screen information includes the pre-edit content, the post-edit content, and the associations between the sections and groups contained in each content.
- Each element of the screen information is expressed by section components and group components.
- A component here is an object associated with a piece of section information or group information.
- Section components are of two types: edit section components associated with editing section information, and unedited components, which correspond to the unedited sections lying between non-adjacent edit section components.
- Group components consist of edit group components associated with the group information describing the edit sections.
- the presentation unit 6 represents each component with graphics such as text, images, and video.
- the section components are arranged in time order for each content before and after each editing.
- The edit section association information is used for this; screen information is generated using sections as shown in FIG. 2 (with edit sections interpreted as edit section components and unedited sections as unedited components) together with the section correspondence information.
- Since the section IDs of each edit section component correspond to the section IDs in the group information of the edit groups, screen information that associates edit group components with edit section components, as shown in FIG. 6, can also be generated.
- From the screen information that displays an edit section component together with the edit group component it belongs to, screen information expressed only by the edit group component, which aggregates its edit section components, can be generated by selecting the edit group component with the input unit 5.
- the aggregation of the editing section components means that the content portion expressed by the editing section components is expressed by the editing group component.
- Screen information expressed by both edit section components and edit group components, as shown in FIG. 8, can be generated by aggregating some of the edit group components.
- Conversely, an aggregated group component can be expanded back into screen information showing the original section components together with the group component they belong to. For example, when the aggregated group component C1 shown in FIG. 7 is selected via the input unit 5, all sections of group C1 are displayed.
- In this way, screen information in which part of FIG. 3 is expanded and edit section components and edit group components are mixed and associated between the corresponding pre-edit content and post-edit content can be generated.
- the group association information is used for correspondence between these editing group components. Also, section / group association information is used for the section component / group component correspondence.
- FIG. 9 shows that when a section component H1 is selected with the input unit 5, the group E1 to which that section belongs, the associated section H2, and the group A2 of the associated section are highlighted.
- FIG. 10 shows that when the group component A2 of the post-edit content is selected with the input unit 5, the section components B2 and H2 belonging to the selected group component A2, and the associated group components A1 and E1, are highlighted.
- The input unit 5 receives the user's input for selecting a section component or a group component. This input is used to change the screen information of the editing information display screen generation unit 4, for example to aggregate, expand, or highlight section components and group components.
- Aggregation, expansion, and highlighting can be selected through the input unit 5 by, for example, mouse clicks or a menu screen.
- For example, a component is double-clicked to aggregate or expand it, and single-clicked to highlight it.
- The presentation unit 6 presents the screen information produced by the editing information display screen generation unit 4 on a display or the like.
- The overall configuration of the editing information presentation device according to the second embodiment is the same as that of the editing information presentation device 1 shown in FIG. 1, except that the editing section information acquisition unit 2 determines the importance of the editing history.
- FIG. 11 is a diagram showing the editing section information acquisition unit 2 in the editing information presentation device according to the second embodiment.
- In this editing information presentation device, the editing section information acquisition unit 2 includes an editing history importance determination unit 7 that calculates the importance of the editing history information in the editing history information storage unit 100.
- That is, the editing section information acquisition unit 2 is configured to include the editing history importance determination unit 7 and an editing section information generation unit 8.
- Some editing history information describes editing operations that are too fine-grained to be useful for grasping the change in the overall structure of the content before and after editing.
- The editing history importance determination unit 7 realizes a function of hiding such fine-grained editing information when the change in the content before and after editing is displayed.
- the editing history importance level determination unit 7 calculates the importance level of each editing process stored in the editing history information. Then, the editing history importance level determination unit 7 generates data that associates the editing history with the importance level.
- the importance is, for example, information indicating the degree of change in the editing process, and is used to determine whether the editing history information can be used to grasp the outline of changes in content. The value can be determined by the content of the editing process, the section length to which the editing process is applied, the change in contrast, the presence or absence of the section deletion, and the like.
- the degree of importance is set low for a slight change in the contrast of the video content or the cut of minute noise in the audio content.
- As for the section length to which an editing operation is applied, the longer the section, the greater its effect on the content, so its importance is set higher; a short edit section of less than one second has little effect on the content, so its importance is set low.
- The way importance is determined from the content of the editing operation and the section length can be changed to suit the user's preference, for example by setting a high importance even for short edit sections.
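- One possible way to score importance from the operation type and the section length is sketched below. The weights, the one-second cut-off, and the threshold are illustrative values chosen for the example, not values taken from the patent.

```python
def edit_importance(operation: str, section_length: float) -> float:
    """Assign an importance score to one entry of the editing history.

    Reflects the stated idea that small tweaks (slight contrast changes, tiny
    noise cuts) and very short sections matter less, while deletions and long
    sections matter more. All numeric values are illustrative defaults.
    """
    operation_weight = {
        "delete": 1.0,
        "insert": 1.0,
        "move": 0.8,
        "color_correction": 0.3,
        "noise_cut": 0.2,
    }.get(operation, 0.5)
    # Sections shorter than one second barely affect the content.
    length_weight = min(section_length, 10.0) / 10.0 if section_length >= 1.0 else 0.05
    return operation_weight * length_weight

IMPORTANCE_THRESHOLD = 0.1   # user-adjustable; edits at or below it become sub-sections

print(edit_importance("delete", 5.0))      # 0.5    -> kept as a normal edit section
print(edit_importance("noise_cut", 0.2))   # ~0.01  -> treated as a sub-section
```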
- The editing section information generation unit 8 calculates the editing section information and edit section association information from the editing history information and from the data that associates each history entry with the importance obtained from the editing history importance determination unit 7. An edit section extracted from a history entry whose importance is at or below a predetermined threshold is judged not to be meaningful as edited content and is not generated as an edit section. Instead, sub-section information, i.e. information on a section whose importance is at or below the threshold, is generated.
- the sub-section information includes information indicating an editing section whose importance is equal to or less than a certain threshold, and includes a start / end time, a section length, and a sub-section ID for identifying itself.
- When editing section information is generated in the editing section information generation unit 8, a sub-section that lies inside some edit section is associated with that edit section's information.
- If a sub-section does not lie inside any edit section, the sub-section is converted into editing section information and held together with the already generated editing section information.
- A sub-section not contained in any edit section would otherwise lie inside an unedited section. To prevent an unedited section from containing a sub-section, such a sub-section is treated as an edit section and its sub-section information is converted into editing section information.
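- The handling of sub-sections can be sketched as follows; the dictionary-based representation is an assumption, but the branch structure follows the two cases above (contained in an edit section vs. lying in otherwise unedited material).

```python
def assign_sub_sections(edit_sections, sub_sections):
    """Attach each sub-section to the edit section that contains it; promote the rest.

    edit_sections and sub_sections are lists of dicts with 'start' and 'end' keys.
    A sub-section lying inside some edit section is recorded under that section;
    a sub-section lying in otherwise unedited material is converted into an edit
    section of its own so that unedited sections never contain hidden changes.
    """
    promoted = []
    for sub in sub_sections:
        container = next((e for e in edit_sections
                          if e["start"] <= sub["start"] and sub["end"] <= e["end"]), None)
        if container is not None:
            container.setdefault("sub_sections", []).append(sub)
        else:
            promoted.append(dict(sub))      # becomes ordinary editing section information
    return edit_sections + promoted

edits = [{"section_id": "D1", "start": 4.0, "end": 9.0}]
subs = [{"section_id": "I", "start": 6.0, "end": 6.2},     # inside D1 -> attached
        {"section_id": "K", "start": 15.0, "end": 15.3}]   # outside   -> promoted
print(assign_sub_sections(edits, subs))
```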
- FIG. 12 is a diagram showing an example of merging editing sections extracted from an editing history with low importance.
- FIG. 12 shows the process of calculating the editing section information of the content from editing history information that includes an entry of low importance.
- In this example, two editing operations are applied: a cut-and-rearrange of a section (a) and a section deletion (b). The section deletion (b) affects only a few frames, and its importance as determined by the editing history importance determination unit 7 is assumed to be low.
- In operation (a), section A is cut and rearranged.
- In operation (b), section I inside section D is deleted. Since the importance of section I is at or below the threshold, section I is registered as a sub-section; it is treated as part of section D and is not generated as an edit section.
- The section H, from which section I has been removed, and section J are then merged to form section D2.
- FIG. 13 is a diagram illustrating an example of screen information of the editing information display screen generation unit 4 using sub-section information.
- In FIG. 13, in addition to the screen information shown in FIG. 6, information indicating which sections have sub-section information can be added. The magnifying-glass icon shown in FIG. 13 indicates that the section has sub-section information.
- The editing information display screen generation unit 4 can generate screen information that displays the sub-sections, as shown in FIG. 14, when a section component bearing the magnifying-glass icon is selected through the input unit 5.
- In this way, an edit section produced by a fine-grained editing operation is re-registered as a sub-section, and by hiding sub-sections when presenting the change in content before and after editing, it becomes easier to grasp an overview of the changes to the content.
- The overall configuration of the editing information presentation device according to the third embodiment is the same as that of the editing information presentation device 1 shown in FIG. 1, but it differs in that, when an edit section or an unedited section corresponds between the pre-edit content and the post-edit content, the editing information display screen generation unit 4 aligns the corresponding components in the direction perpendicular to the time axis so that the user can easily confirm the content changes caused by editing.
- FIG. 15 shows the editing information display screen generation unit 4 in the editing information presentation device according to the third embodiment.
- A layout setting unit 11 is added to the editing information display screen generation unit 4 described above.
- the edit information display screen generation unit 4 includes a component generation / association unit 9, a component / component association selection unit 10, and a layout setting unit 11.
- the component generation / association unit 9 generates three components: an unedited section component, an editing section component, and an editing group component. As already described, this component indicates an object associated with each section information and each group information.
- The component/component association selection unit 10 generates, from the input received through the input unit 5, screen information in which section components are aggregated or group components are expanded, and in which the correspondences between components are highlighted, as in FIGS. 8 to 10.
- The layout setting unit 11 uses the fact that the unedited section components of the content before and after editing appear in the same order with unchanged section lengths to align identical components of the pre-edit content and the post-edit content in the direction perpendicular to the time axis.
- First, the corresponding unedited section components of the pre-edit content and the post-edit content are aligned in the direction perpendicular to the time axis.
- Next, when the same edit section component exists in both the pre-edit content and the post-edit content, each such pair of edit components is also aligned in the direction perpendicular to the time axis.
- Edit section components without a counterpart are not placed in the same column; instead a blank is inserted in the direction perpendicular to the edit section component to adjust the layout.
- This blank is inserted into the post-edit content if the edit section component belongs to the pre-edit content, and into the pre-edit content if it belongs to the post-edit content.
- An edit group component has no identical counterpart, because the contents of the group differ between the content before and after editing; a blank therefore has to be inserted for it as well.
- The blank is inserted in the same way as for edit section components: if the edit group component belongs to the pre-edit content, the blank is inserted into the post-edit content, and vice versa.
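- The blank-insertion layout can be sketched as a simple greedy column merge, as below. The component rows, the BLANK placeholder, and the greedy matching are assumptions for illustration; a real layout engine would also carry start/end times and pixel positions.

```python
BLANK = None   # placeholder column inserted so that the two rows stay vertically aligned

def align_rows(before_row, after_row, pairs):
    """Lay out two component rows so that paired components share a column.

    before_row / after_row are lists of component IDs in playback order;
    pairs is a set of (before_id, after_id) correspondences (unedited components
    and matching edit components). Unpaired components get a BLANK inserted in
    the opposite row, as described for edit group components above.
    """
    after_of = {b: a for b, a in pairs}
    cols_before, cols_after = [], []
    i = j = 0
    while i < len(before_row) or j < len(after_row):
        b = before_row[i] if i < len(before_row) else None
        a = after_row[j] if j < len(after_row) else None
        if b is not None and after_of.get(b) == a:      # matching pair -> same column
            cols_before.append(b); cols_after.append(a); i += 1; j += 1
        elif b is not None and b not in after_of:       # only in the pre-edit content
            cols_before.append(b); cols_after.append(BLANK); i += 1
        else:                                           # only in the post-edit content
            cols_before.append(BLANK); cols_after.append(a); j += 1
    return cols_before, cols_after

before = ["A1", "D1", "B1", "E1"]          # E1: edit group with no counterpart
after  = ["A2", "D2", "B2", "F2"]          # F2: edit group with no counterpart
print(align_rows(before, after, {("A1", "A2"), ("D1", "D2"), ("B1", "B2")}))
# (['A1', 'D1', 'B1', 'E1', None], ['A2', 'D2', 'B2', None, 'F2'])
```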
- FIG. 16 is a diagram showing an example of the change in the component layout by the layout setting unit 11.
- In FIG. 16, the corresponding unedited section components of the content before and after editing are aligned in the direction perpendicular to the time axis.
- The edit section component D1 corresponds to the edit section component D2 lying between the unedited components A2 and B2 of the post-edit content. Since D1 and D2 lie between the vertically aligned unedited section component pairs (A1, A2) and (B1, B2), they too can be placed aligned in the direction perpendicular to the time axis.
- The other edit section components and edit group components have no counterparts, so blanks are inserted and they are aligned perpendicular to the time axis.
- The editing information presentation device according to the fourth embodiment has a configuration in which an unedited section information generation unit 12, which extracts unedited sections, is added to the editing section information acquisition unit 2 described in the first embodiment, which has the function of extracting edit sections.
- An unedited section is a section that exists in both the content before editing and the content after editing and to which no editing operation has been applied.
- the unedited section information is information for specifying a section where editing has not been performed between the content before editing and the content after editing. It consists of an ID for identifying content in which an unedited section exists, an ID for identifying an unedited section in the content, a section start / end time, and a section length.
- The editing section information is produced by the editing section information generation unit 8, and the unedited section information is extracted by the unedited section information generation unit 12 based on that editing section information.
- An unedited section can be extracted by removing the edit sections from the whole content. That is, all edit sections are mapped onto the timeline of the whole content, and the remaining parts that are contiguous on the time axis are determined to be unedited sections, whose unedited section information is then extracted.
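- The timeline-difference extraction of unedited sections can be sketched as follows, assuming edit sections are available as (start, end) pairs:

```python
def unedited_sections(total_length, edit_sections):
    """Compute the unedited sections of one piece of content.

    edit_sections is a list of (start, end) tuples; total_length is the content
    duration. Everything on the timeline not covered by an edit section and
    contiguous on the time axis becomes one unedited section.
    """
    gaps, cursor = [], 0.0
    for start, end in sorted(edit_sections):
        if start > cursor:
            gaps.append((cursor, start))
        cursor = max(cursor, end)
    if cursor < total_length:
        gaps.append((cursor, total_length))
    return gaps

print(unedited_sections(30.0, [(5.0, 8.0), (8.0, 12.0), (20.0, 22.0)]))
# [(0.0, 5.0), (12.0, 20.0), (22.0, 30.0)]
```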
- The editing history information may also record, as an editing operation, the fact that no editing was performed. In that case the information stored for the section is the same as for an ordinary editing operation, except that "no editing" is recorded as the operation content.
- When such unedited history entries exist, that is, when unedited sections are explicitly described as unedited, the information on those sections can be used directly as unedited section information.
- Next, the unedited section information generation unit 12 associates the matching unedited sections of the content before and after editing based on the unedited section information.
- This association information can be generated either from the order and lengths of the unedited sections, or by extracting image/sound features of the unedited sections and comparing the pre-edit content with the post-edit content.
- The former uses the fact that the order and lengths of unedited sections do not change before and after editing, and associates unedited sections with the same section length in time order between the pre-edit content and the post-edit content. The latter extracts image/sound features from the unedited sections before and after editing and associates the unedited sections whose features are most similar.
- For the unedited sections associated in this way, unedited section association information is generated that pairs the content ID in which each unedited section exists with the section ID newly assigned to that unedited section.
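- A sketch of the order-and-length matching (the former method) is given below; the tolerance parameter and the greedy scan are assumptions, and the feature-based matching mentioned above could be substituted for the length comparison.

```python
def match_unedited_sections(before, after, tolerance=0.0):
    """Pair up unedited sections of the pre-edit and post-edit content.

    before / after are lists of (section_id, length) tuples already in time order.
    Because unedited sections keep their order and length across editing, they
    are matched greedily in order whenever the lengths agree (within `tolerance`).
    """
    pairs, j = [], 0
    for sec_id, length in before:
        while j < len(after) and abs(after[j][1] - length) > tolerance:
            j += 1                      # skip post-edit sections that cannot match
        if j < len(after):
            pairs.append((sec_id, after[j][0]))
            j += 1
    return pairs

print(match_unedited_sections([("U1", 4.0), ("U2", 7.5)],
                              [("V1", 4.0), ("V2", 7.5)]))
# [('U1', 'V1'), ('U2', 'V2')]
```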
- The editing information display screen generation unit 4 expresses an unedited section as an unedited section component in the screen information generated as in the editing information presentation device 1 of the first embodiment. This unedited section component is associated with the unedited section information.
- Screen information is generated by adding, to the edit section and edit group components presented by the presentation unit 6 of the first embodiment, the corresponding unedited section components and the correspondence information between unedited sections. The section components are then arranged in time order based on the start and end times of the edit sections and the unedited sections.
- According to the present embodiment, the edited content can be displayed in an aggregated form based on the association information between the groups of the content before and after editing, so an overview of the edited content can be grasped and the edited content remains easy to check even when the editing history information grows.
- In addition, by extracting the association information between the sections and groups of the pre-edit content and those of the post-edit content and displaying the multiple pieces of correspondence information, the intention of the editing can be understood more clearly.
- For example, the swapping of sections can be recognized from the correspondence between two edit sections that have been moved, and the movement of a section can be recognized when only one edit section is sandwiched between unedited sections.
- According to the present embodiment, the editing information can be partially aggregated and expanded, so the editing information of the content is easy to confirm.
- By aggregating part of the editing history information, sufficient space is left for displaying the rest, and the editing information the user focuses on can be shown there in detail, making it easy for the user to confirm it.
- Editing information that the user is not focusing on can be displayed compactly in an aggregated form.
- According to the present embodiment, the unedited sections can be used as a framework for organizing and summarizing the changes to the content before and after editing.
- Using the fact that the content consists of only two kinds of sections, edit sections and unedited sections, edit sections that are consecutive in time order are grouped and the pre-edit content and the post-edit content are associated with each other, so the editing information is organized by the unedited sections that delimit it.
- Since the correspondences between edit sections, between edit sections and edit groups, and between edit groups are extracted, the association information can be presented in an aggregated form, which makes the information easy to confirm.
- According to the present embodiment, by changing how the association information is displayed in response to input, information on the parts the user is interested in is presented in detail while information on the parts the user is not focusing on is displayed compactly, so the information the user pays attention to can easily be given sufficient space.
- the present invention is configured as described above, but is not limited to the above-described embodiments, and various modifications can be made within the scope of the gist of the present invention.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Television Signal Processing For Recording (AREA)
- Management Or Editing Of Information On Record Carriers (AREA)
Abstract
Description
An editing information presentation device is provided, comprising: editing section information acquisition means for associating, with each other, a plurality of mutually corresponding sections between a portion composed of a plurality of modified sections sandwiched between unmodified sections in the pre-edit content and a portion composed of a plurality of modified sections sandwiched between unmodified sections in the post-edit content; and presentation means for presenting the association of the plurality of sections as a single correspondence relationship.
An editing information presentation method is provided, comprising: an editing section information acquisition step of associating, with each other, a plurality of mutually corresponding sections between a portion composed of a plurality of modified sections sandwiched between unmodified sections in the pre-edit content and a portion composed of a plurality of modified sections sandwiched between unmodified sections in the post-edit content; and a presentation step of presenting the association of the plurality of sections as a single correspondence relationship.
First, the configuration and functions of the editing information presentation device 1 according to the first embodiment of the present invention will be described with reference to FIG. 1. FIG. 1 is a diagram showing an example of the schematic configuration of the editing information presentation device 1 according to the first embodiment.
Next, an editing information presentation device according to the second embodiment of the present invention will be described with reference to the drawings.
Next, an editing information presentation device according to the third embodiment of the present invention will be described with reference to the drawings.
Next, an editing information presentation device according to the fourth embodiment of the present invention will be described with reference to the drawings.
2 Editing section information acquisition unit
3 Pre-edit content/post-edit content correspondence acquisition unit
4 Editing information display screen generation unit
5 Input unit
6 Presentation unit
7 Editing history importance determination unit
8 Editing section information generation unit
9 Component generation/association unit
10 Component/component association selection unit
11 Layout setting unit
12 Unedited section information generation unit
100 Editing history information storage unit
Claims (15)
- An editing information presentation device that presents a correspondence relationship before and after editing in an editing process in which content having a time axis is modified section by section, comprising: editing section information acquisition means for associating, with each other, a plurality of mutually corresponding sections between a portion composed of a plurality of modified sections sandwiched between unmodified sections in the pre-edit content and a portion composed of a plurality of modified sections sandwiched between unmodified sections in the post-edit content; and presentation means for presenting the association of the plurality of sections as a single correspondence relationship.
- The editing information presentation device according to claim 1, wherein the editing section information acquisition means associates the modified sections of the pre-edit content with the modified sections of the post-edit content on the basis of editing history information generated when the post-edit content was generated from the pre-edit content.
- The editing information presentation device according to claim 2, wherein the editing history information includes, for each editing operation, time information of the section to which the editing operation was applied, and the editing section information acquisition means determines, on the basis of the time information of the editing history information, that a modified section of the pre-edit content is sandwiched between unmodified sections by finding the plurality of temporally consecutive modified sections.
- The editing information presentation device according to claim 2, further comprising: pre-edit content/post-edit content correspondence acquisition means for calculating, in each of the pre-edit content and the post-edit content, the modified sections as edit sections and grouping temporally consecutive edit sections in each of the pre-edit content and the post-edit content to generate edit groups; and editing information display screen generation means for extracting, from the editing history information, edit section association information indicating the correspondence between the edit sections of the pre-edit content and the edit sections of the post-edit content, extracting, on the basis of the edit section association information, inter-edit-group association information indicating the correspondence between edit groups of the pre-edit content and the post-edit content, extracting, on the basis of the edit section association information, section/group association information indicating the correspondence between edit sections and edit groups of the pre-edit content and the post-edit content, expressing the structures of the pre-edit content and the post-edit content by the edit sections and the edit groups, and causing the presentation means to present the correspondence relationship between the pre-edit content and the post-edit content using at least one of the edit section association information, the inter-edit-group association information, and the section/group association information.
- The editing information presentation device according to claim 4, further comprising input means for receiving input from a user, wherein the editing information display screen generation means selects and uses one of the edit section association information, the inter-edit-group association information, and the section/group association information in accordance with the input from the input means, and causes the presentation means to present the correspondence relationship between the pre-edit content and the post-edit content as a correspondence between edit sections, a correspondence between an edit section and an edit group, or a correspondence between edit groups.
- The editing information presentation device according to claim 4, wherein, when presenting the correspondence relationship between the pre-edit content and the post-edit content, the editing section information acquisition means calculates the importance of each edit section from the length of the edit section and the content of the editing operation applied to it, changes an edit section whose calculated importance is lower than a predetermined threshold into a sub-section, and hides the sub-section when the pre-edit content and the post-edit content are presented.
- The editing information presentation device according to claim 4, wherein the editing information display screen generation means arranges the mutually corresponding sections and groups of the pre-edit content and the post-edit content in playback order, then inserts blanks between sections and groups in at least one of the pre-edit content and the post-edit content so that the positions of mutually corresponding sections and groups are aligned in the direction perpendicular to the playback order, and causes the presentation means to present the correspondence relationship between the pre-edit content and the post-edit content.
- The editing information presentation device according to claim 1, wherein the editing section information acquisition means extracts sections to which no modification has been applied as unedited sections, extracts the correspondence between the unedited sections, and causes the presentation means to present the correspondence relationship between the pre-edit content and the post-edit content in the sections sandwiched between the unedited sections.
- The editing information presentation device according to claim 8, wherein the editing section information acquisition means extracts the unedited sections between the pre-edit content and the post-edit content by taking the difference of the editing section information between the pre-edit content and the post-edit content.
- The editing information presentation device according to claim 8, wherein, when the editing history information generated when the post-edit content was generated from the pre-edit content includes information on unedited sections, i.e. sections that were not edited, the editing section information acquisition means extracts the unedited sections between the pre-edit content and the post-edit content using that information.
- The editing information presentation device according to claim 8, wherein the editing section information acquisition means extracts the association information of the unedited sections between the pre-edit content and the post-edit content by associating, in the order in which the unedited sections appear in the pre-edit content and the post-edit content, unedited sections having the same section length.
- The editing information presentation device according to claim 8, wherein the editing section information acquisition means calculates the association information of the unedited sections between the pre-edit content and the post-edit content by matching each unedited section of the pre-edit content and the post-edit content using video or audio features and associating identical unedited sections with each other.
- An editing information presentation method in an editing information presentation device that presents a correspondence relationship before and after editing of content for an editing process in which content having a time axis is modified section by section, comprising: an editing section information acquisition step of associating, with each other, a plurality of mutually corresponding sections between a portion composed of a plurality of modified sections sandwiched between unmodified sections in the pre-edit content and a portion composed of a plurality of modified sections sandwiched between unmodified sections in the post-edit content; and a presentation step of presenting the association of the plurality of sections as a single correspondence relationship.
- A program that causes a computer to function as the editing information presentation device according to any one of claims 1 to 12.
- A recording medium on which the program according to claim 14 is recorded in a computer-readable manner.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2009801527194A CN102265610A (zh) | 2008-12-25 | 2009-12-25 | 已编辑信息提供设备、已编辑信息提供方法、程序以及存储介质 |
US12/998,960 US8819558B2 (en) | 2008-12-25 | 2009-12-25 | Edited information provision device, edited information provision method, program, and recording medium |
JP2010543907A JPWO2010073695A1 (ja) | 2008-12-25 | 2009-12-25 | 編集情報提示装置、編集情報提示方法、プログラム、及び記録媒体 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008329699 | 2008-12-25 | ||
JP2008-329699 | 2008-12-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010073695A1 true WO2010073695A1 (ja) | 2010-07-01 |
Family
ID=42287328
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/007241 WO2010073695A1 (ja) | 2008-12-25 | 2009-12-25 | 編集情報提示装置、編集情報提示方法、プログラム、及び記録媒体 |
Country Status (4)
Country | Link |
---|---|
US (1) | US8819558B2 (ja) |
JP (1) | JPWO2010073695A1 (ja) |
CN (1) | CN102265610A (ja) |
WO (1) | WO2010073695A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012108090A1 (ja) * | 2011-02-10 | 2012-08-16 | 日本電気株式会社 | 映像間対応関係表示システム及び映像間対応関係表示方法 |
WO2012108089A1 (ja) * | 2011-02-10 | 2012-08-16 | 日本電気株式会社 | 映像間対応関係表示システム及び映像間対応関係表示方法 |
JP2013080989A (ja) * | 2011-09-30 | 2013-05-02 | Jvc Kenwood Corp | 動画編集装置、動画編集方法およびコンピュータプログラム |
WO2017039040A1 (ko) * | 2015-09-04 | 2017-03-09 | (주)마인드퀘이크 | 컨텐츠 제공 방법 및 컨텐츠 제공 장치 |
JP2018136389A (ja) * | 2017-02-21 | 2018-08-30 | 日本放送協会 | 音声データの比較処理プログラム |
WO2021019645A1 (ja) * | 2019-07-29 | 2021-02-04 | 日本電気株式会社 | 学習データ生成装置、学習装置、識別装置、生成方法及び記憶媒体 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109857293B (zh) * | 2018-12-28 | 2021-04-13 | 维沃移动通信有限公司 | 显示方法及终端设备 |
JP2021106369A (ja) * | 2019-12-27 | 2021-07-26 | 京セラドキュメントソリューションズ株式会社 | 情報処理装置及び画像形成装置 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001203979A (ja) * | 2000-01-24 | 2001-07-27 | Mitsubishi Electric Corp | 記録再生装置、記録再生装置における編集方法およびその方法をコンピュータに実行させるプログラムを記録したコンピュータ読み取り可能な記録媒体 |
JP2004040767A (ja) * | 2002-03-21 | 2004-02-05 | Canon Inc | デュアルモード型のタイムライン・インタフェース |
JP2005094391A (ja) * | 2003-09-18 | 2005-04-07 | Pioneer Electronic Corp | データ編集記録装置、データ編集記録方法、並びに、データ編集記録プログラムおよびそれを記録した記録媒体 |
JP2007336106A (ja) * | 2006-06-13 | 2007-12-27 | Osaka Univ | 映像編集支援装置 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6670966B1 (en) * | 1998-11-10 | 2003-12-30 | Sony Corporation | Edit data creating device and edit data creating method |
US7020381B1 (en) * | 1999-11-05 | 2006-03-28 | Matsushita Electric Industrial Co., Ltd. | Video editing apparatus and editing method for combining a plurality of image data to generate a series of edited motion video image data |
US7305381B1 (en) * | 2001-09-14 | 2007-12-04 | Ricoh Co., Ltd | Asynchronous unconscious retrieval in a network of information appliances |
JP4914278B2 (ja) | 2007-04-16 | 2012-04-11 | 富士フイルム株式会社 | 画像処理装置、方法およびプログラム |
-
2009
- 2009-12-25 US US12/998,960 patent/US8819558B2/en active Active
- 2009-12-25 WO PCT/JP2009/007241 patent/WO2010073695A1/ja active Application Filing
- 2009-12-25 JP JP2010543907A patent/JPWO2010073695A1/ja active Pending
- 2009-12-25 CN CN2009801527194A patent/CN102265610A/zh active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001203979A (ja) * | 2000-01-24 | 2001-07-27 | Mitsubishi Electric Corp | 記録再生装置、記録再生装置における編集方法およびその方法をコンピュータに実行させるプログラムを記録したコンピュータ読み取り可能な記録媒体 |
JP2004040767A (ja) * | 2002-03-21 | 2004-02-05 | Canon Inc | デュアルモード型のタイムライン・インタフェース |
JP2005094391A (ja) * | 2003-09-18 | 2005-04-07 | Pioneer Electronic Corp | データ編集記録装置、データ編集記録方法、並びに、データ編集記録プログラムおよびそれを記録した記録媒体 |
JP2007336106A (ja) * | 2006-06-13 | 2007-12-27 | Osaka Univ | 映像編集支援装置 |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012108090A1 (ja) * | 2011-02-10 | 2012-08-16 | 日本電気株式会社 | 映像間対応関係表示システム及び映像間対応関係表示方法 |
WO2012108089A1 (ja) * | 2011-02-10 | 2012-08-16 | 日本電気株式会社 | 映像間対応関係表示システム及び映像間対応関係表示方法 |
US9172936B2 (en) | 2011-02-10 | 2015-10-27 | Nec Corporation | Inter-video corresponding relationship display system and inter-video corresponding relationship display method |
JP5854232B2 (ja) * | 2011-02-10 | 2016-02-09 | 日本電気株式会社 | 映像間対応関係表示システム及び映像間対応関係表示方法 |
US9473734B2 (en) | 2011-02-10 | 2016-10-18 | Nec Corporation | Inter-video corresponding relationship display system and inter-video corresponding relationship display method |
JP6037443B2 (ja) * | 2011-02-10 | 2016-12-07 | 日本電気株式会社 | 映像間対応関係表示システム及び映像間対応関係表示方法 |
JP2013080989A (ja) * | 2011-09-30 | 2013-05-02 | Jvc Kenwood Corp | 動画編集装置、動画編集方法およびコンピュータプログラム |
WO2017039040A1 (ko) * | 2015-09-04 | 2017-03-09 | (주)마인드퀘이크 | 컨텐츠 제공 방법 및 컨텐츠 제공 장치 |
JP2018136389A (ja) * | 2017-02-21 | 2018-08-30 | 日本放送協会 | 音声データの比較処理プログラム |
WO2021019645A1 (ja) * | 2019-07-29 | 2021-02-04 | 日本電気株式会社 | 学習データ生成装置、学習装置、識別装置、生成方法及び記憶媒体 |
JPWO2021019645A1 (ja) * | 2019-07-29 | 2021-02-04 | ||
JP7268739B2 (ja) | 2019-07-29 | 2023-05-08 | 日本電気株式会社 | 学習データ生成装置、学習装置、識別装置、生成方法及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
JPWO2010073695A1 (ja) | 2012-06-07 |
US20110258546A1 (en) | 2011-10-20 |
US8819558B2 (en) | 2014-08-26 |
CN102265610A (zh) | 2011-11-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2010073695A1 (ja) | 編集情報提示装置、編集情報提示方法、プログラム、及び記録媒体 | |
JP4607987B2 (ja) | エンハンス・コンテンツを有するタイム・ベース・メディアの編集 | |
US8196032B2 (en) | Template-based multimedia authoring and sharing | |
US20030018609A1 (en) | Editing time-based media with enhanced content | |
US20100003006A1 (en) | Video searching apparatus, editing apparatus, video searching method, and program | |
US20090327856A1 (en) | Annotation of movies | |
US8631047B2 (en) | Editing 3D video | |
JPH07182365A (ja) | マルチメディア会議録作成支援装置および方法 | |
CN104516861B (zh) | 多媒体互动文档处理方法 | |
JPH11162107A (ja) | デジタルビデオ情報及びオーディオ情報を編集するためのシステム | |
WO2013016312A1 (en) | Web-based video navigation, editing and augmenting apparatus, system and method | |
CN103324513B (zh) | 程序注释方法和装置 | |
CN101110930A (zh) | 记录控制装置和记录控制方法,以及程序 | |
DE10393469T5 (de) | Optische Platte, Wiedergabevorrichtung, Programm, Wiedergabeverfahren und Aufzeichnungsverfahren | |
JP2006042317A (ja) | アーカイブ管理装置、アーカイブ管理システム及びアーカイブ管理プログラム | |
US11942117B2 (en) | Media management system | |
GB2520041A (en) | Automated multimedia content editing | |
JP6603929B1 (ja) | 動画編集サーバおよびプログラム | |
WO2020201297A1 (en) | System and method for performance-based instant assembling of video clips | |
JP2001502858A (ja) | ディジタル音声及び画像情報の符号化データのデータベースを有したディジタル画像システム | |
JP2002008052A (ja) | プレゼンテーションシステムおよび記録媒体 | |
CN106527845B (zh) | 对文本中鼠标操作进行语音注释并再现的方法及装置 | |
JP2006157687A (ja) | 視聴者間コミュニケーション方法及び装置及びプログラム | |
KR20140051115A (ko) | 미디어 파일들의 이벤트의 로그 | |
JP2005167822A (ja) | 情報再生装置及び情報再生方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980152719.4 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09834501 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12998960 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2010543907 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 09834501 Country of ref document: EP Kind code of ref document: A1 |