WO2010113211A1 - Animation editing device and animation playback device - Google Patents
Animation editing device and animation playback device
- Publication number
- WO2010113211A1 (PCT/JP2009/001497, JP2009001497W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- animation
- data
- editing
- tag
- timeline
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/034—Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/34—Indicating arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
Definitions
- the present invention relates to an animation editing apparatus for creating a user interface and movie content using animation by an animation description model based on interpolation between key frames, and an animation reproducing apparatus for reproducing the animation.
- Conventionally, the screen states at multiple times on the timeline, which serves as the reference for the animation, have been defined as key frames.
- A screen state is the display position or display state of the components (objects) constituting the screen.
- An animation is created by defining a method for interpolating, over time between key frames, the changes in the display positions and display states of the parts arranged in the key frames.
- Patent Document 1 discloses an animation editing technique that does not associate a key frame directly with a time; instead, it associates the key frame with a symbol that abstractly represents a time, and separately associates that symbol with an actual time. By separating key-frame editing from time editing in this way, the efficiency of the animation editing process is improved.
- In the conventional approach, however, both the change in a part's display position and the change in its display state are defined with respect to display time on the timeline. Consequently, if one wants to change a display state (such as display color) according to a part's display position, the desired display position and display state must be associated with the same time on the timeline, and position editing and time editing must be performed simultaneously, which makes the editing process complicated.
- The present invention has been made to solve the above problems. An object of the present invention is to provide an animation editing apparatus with which the position of a part on the screen and the display state of the part at that position can easily be edited in association with each other, and with which the correspondence between positions and display states can easily be understood from the editing result, as well as an animation playback apparatus that plays back the resulting animation.
- In the animation data handled by the present invention, the layout of the key frames serving as references for the animation, the arrangement of tags in the key frames, and the contents of the interpolation between key frames are defined.
- The animation data includes timeline data, in which the key-frame layout and the interpolation contents between key frames along the timeline are defined, and spaceline data, in which the key-frame layout, tag arrangement, and interpolation contents between key frames along the spaceline are defined; the spaceline is a one-dimensional line expressing the relative positional relationship between the display position of an animation part and the reference positions indicated by tags.
- The apparatus comprises: data storage means for storing this animation data; timeline editing means for creating key frames to be placed on the timeline, placing key frames on the timeline, and setting the interpolation contents between key frames on the timeline, for the timeline data read from the data storage means in accordance with an input editing instruction; spaceline editing means for creating key frames to be placed on the spaceline, placing key frames on the spaceline, and setting the interpolation contents between key frames on the spaceline, for the spaceline data read from the data storage means in accordance with an input editing instruction; tag editing means for setting a tag at the position where editing is instructed on the spaceline, for the spaceline data read from the data storage means; tag placement means for setting a tag at the position where editing is instructed in a key frame of the animation data read from the data storage means; and animation editing management means that presents the timeline and spaceline defined by the timeline data and spaceline data of the animation data to be edited, together with the frame contents based on them, accepts editing instructions, causes the timeline editing means, the spaceline editing means, the tag editing means, and the tag placement means to execute the editing processes corresponding to the input instructions, and presents the animation editing results.
- In this way, the timeline, the spaceline, and the frame contents based on them are presented, editing instructions are accepted, and timeline editing, spaceline editing, tag editing, and tag placement are executed according to the editing instructions input in response to the presented contents.
- As a result, the position of an animation image on the display screen and the display state at that position can easily be associated with each other and edited, so animation editing efficiency can be improved.
- FIG. 3 is a flowchart showing the operation of the animation editing apparatus according to the first embodiment. FIG. 4 is a diagram showing the main operation screen of the animation editing apparatus according to the first embodiment.
- FIG. 5 is a flowchart showing the flow of processing in the editing process step of FIG. 3. FIG. 6 is a flowchart showing the flow of the timeline editing process of the animation editing apparatus according to the first embodiment. FIG. 7 is a diagram showing a specific example of the timeline editing process, and FIG. 8 shows another example.
- FIG. 9 is a flowchart showing the flow of the spaceline editing process of the animation editing apparatus according to the first embodiment. FIG. 10 is a diagram showing a specific example of the spaceline editing process, and FIG. 11 shows a specific example of the tag placement process.
- Further figures show examples of the data structures of the animation data, the timeline data, the key frame data, the interpolation setting data, the spaceline data, and the tag data.
- Further figures show the main operation screen and the tag arrangement of the animation editing apparatus according to the third embodiment.
- FIG. 1 is a block diagram showing a configuration of an animation editing apparatus according to Embodiment 1 of the present invention.
- an animation editing management unit 101 reads and writes animation data to and from a storage medium 106, inputs an editing instruction from an editor (user), and presents an editing result to the editor.
- An input device such as a keyboard or a mouse is used to input an editing instruction.
- an output device such as a display or a speaker is used for outputting the editing result.
- The timeline editing unit 102 creates key frames, arranges parts in the key frames, arranges the key frames on the timeline (time axis) indicating the temporal display order of the frames, and sets the interpolation contents between the key frames.
- As the interpolation content, it is set how the display position and display state of the parts arranged in the key frames change over time between the key frames. For example, when a part's display position is to be changed by a constant amount of movement, linear interpolation based on that movement amount is set.
- a method for setting a key frame on the timeline for example, there are the following methods (only one of these may be used, or both may be used).
- the timeline is data representing the temporal relationship (display time axis) of the frames constituting the animation.
- the temporal display order of the frames constituting the animation is set, and among these frames, the frames designated by editing are arranged as key frames. That is, the frame includes both the key frame and the interpolation frame.
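The relationship described above between key frames and interpolation frames on the timeline can be sketched as follows. This is a minimal Python illustration; the function names, the dictionary representation of key frames, and the clamping behavior outside the keyed range are assumptions for illustration, not taken from the patent:

```python
def lerp(a, b, t):
    """Linear interpolation between two attribute values."""
    return a + (b - a) * t

def frame_value(keyframes, frame):
    """Return an attribute's value at `frame` on the timeline.

    `keyframes` maps key-frame index -> value; frames between two
    key frames are interpolation frames whose values are linearly
    interpolated, and frames outside the keyed range clamp to the
    nearest key frame.
    """
    times = sorted(keyframes)
    if frame <= times[0]:
        return keyframes[times[0]]
    if frame >= times[-1]:
        return keyframes[times[-1]]
    for t0, t1 in zip(times, times[1:]):   # find surrounding key frames
        if t0 <= frame <= t1:
            t = (frame - t0) / (t1 - t0)
            return lerp(keyframes[t0], keyframes[t1], t)

# e.g. a cursor part's Y position keyed at frames 0 and 10
ypos = {0: 100, 10: 200}
print(frame_value(ypos, 5))   # halfway between the two key frames -> 150.0
```

A spline interpolation setting would replace `lerp` with a curved easing function; the lookup of the surrounding key frames stays the same.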
- The spaceline editing unit 103 creates key frames, places parts in the key frames, places the key frames on the spaceline, and sets the interpolation contents for part positions and part display states between the key frames on the spaceline.
- The spaceline is data indicating, on a one-dimensional straight line, the relative positional relationship between the reference positions indicated by tags and the display position of an animation part (arranged in a key frame of other animation data).
- Note that unless the part having the spaceline and the tags defined for it are arranged on the same frame, the distance between the part and a tag (described later in Embodiment 2) cannot be calculated. For this reason, the spaceline is used only for animation data that is itself used as a part within other animation data.
- The tag editing means 104 defines a tag at the designated position on the spaceline. For example, when tag placement is instructed via the animation editing management unit 101 for a given frame on the spaceline displayed on the main operation screen, the tag editing unit 104 sets a tag at the instructed frame position in accordance with the editing instruction.
- the tag placement unit 105 generates and places a tag at a designated position on the screen of the key frame. For example, a tag instructed via the animation editing management means 101 is generated, and the tag is arranged at the instructed position on the key frame screen.
- a tag may be arranged on the screen of an arbitrary frame other than the key frame.
- the tag may be moved by animation.
- Tags placed in key frames are inherited in interpolation frames. Thus, when the interpolation frame is displayed, it can be confirmed that the tag is arranged at the same position as the immediately preceding key frame.
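The inheritance of tags by interpolation frames can be sketched as follows (an illustrative sketch; the patent does not specify a data representation, so the mapping structure and names are assumptions):

```python
def tags_at(frame, key_tags):
    """Return the tag placements in effect at `frame`.

    `key_tags` maps key-frame index -> {tag name: screen position};
    an interpolation frame inherits the tags of the immediately
    preceding key frame unchanged.
    """
    prior = [t for t in key_tags if t <= frame]
    if not prior:
        return {}
    return key_tags[max(prior)]

key_tags = {0: {"light": (10, 20)}, 8: {"light": (30, 20)}}
print(tags_at(5, key_tags))   # inherited from key frame 0 -> {'light': (10, 20)}
```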
- the storage medium (data storage means) 106 holds the animation data created by the animation edit management means 101.
- the animation data 107 includes time line data 108 and space line data 109 as shown in FIG. 1 in addition to the data of each frame constituting the animation.
- the timeline data 108 is data defining each frame of the animation based on the timeline edited by the timeline editing unit 102, and includes a key frame 110 and interpolation setting data 112.
- the data of the key frame 110 is layout data of a screen (key frame) serving as an animation key, and includes tag arrangement data 111 indicating tag arrangement.
- the tag arrangement data 111 is data indicating the screen arrangement of tags in the key frame.
- the interpolation setting data 112 is data indicating the interpolation contents of the component positions and component display states between key frames.
- the space line data 109 is data that defines each frame of the animation based on the space line edited by the space line editing unit 103, and includes tag data 113 in addition to the key frame 110 and the interpolation setting data 112.
- the tag data 113 is data that defines the tags arranged in the space line.
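The structure of the animation data 107 described above (timeline data 108, spaceline data 109, key frames 110, tag arrangement data 111, interpolation setting data 112, and tag data 113) might be modeled roughly as follows; all field names and types are assumptions for illustration:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Tag:                        # tag data 113 / tag arrangement data 111
    name: str                     # tags are identified by name (e.g. "light")
    position: float               # position on the spaceline or key-frame screen

@dataclass
class KeyFrame:                   # key frame 110
    index: int                    # position on the timeline or spaceline
    parts: Dict[str, dict] = field(default_factory=dict)   # part -> attributes
    tags: List[Tag] = field(default_factory=list)

@dataclass
class Line:                       # common shape of timeline/spaceline data
    keyframes: List[KeyFrame] = field(default_factory=list)
    interpolation: str = "linear"  # interpolation setting data 112

@dataclass
class AnimationData:              # animation data 107
    timeline: Line = field(default_factory=Line)    # timeline data 108
    spaceline: Line = field(default_factory=Line)   # spaceline data 109

anim = AnimationData()
anim.spaceline.keyframes.append(KeyFrame(0, tags=[Tag("light", 0.0)]))
anim.spaceline.keyframes.append(KeyFrame(10, tags=[Tag("dark", 1.0)]))
print(anim.spaceline.keyframes[1].tags[0].name)   # dark
```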
- FIG. 2 is a diagram showing a hardware configuration of the animation editing apparatus according to the first embodiment.
- The animation editing apparatus according to Embodiment 1 is constructed on a computer as shown in FIG. 2. That is, the animation editing management unit 101, the timeline editing unit 102, the spaceline editing unit 103, the tag editing unit 104, and the tag placement unit 105 are realized by having the CPU 202 read and execute an animation editing program according to the gist of the present invention.
- The output device controller 205 displays instruction selection screens, editing results, and the like on the display 206 and outputs sound through the speaker 208. In response, editing instructions entered through input devices such as the mouse 200 and the keyboard 207 are passed by the input device controller 201 to the animation editing management unit 101, the timeline editing unit 102, the spaceline editing unit 103, the tag editing unit 104, and the tag placement unit 105, which execute them.
- the storage medium 106 can be constructed on the storage area of the hard disk 203 and the memory 204 built in the computer as a standard or on the storage medium of an external storage device.
- FIG. 3 is a flowchart showing the operation of the animation editing apparatus according to the first embodiment.
- the animation editing operation will be described with reference to this figure.
- the animation editing management unit 101 enters an input waiting state for accepting an editing instruction from the outside.
- When an editing instruction is input by the editor (user) using an input device such as a keyboard or mouse, the animation editing management means 101 receives the input editing instruction information (step ST1; editing instruction input step).
- In step ST2, it is determined whether the instruction input in step ST1 is an instruction to end editing (editing end determination step). If it is (step ST2; YES), the operation of the animation editing apparatus ends. If it is not (step ST2; NO), the process proceeds to step ST3.
- In step ST3, the timeline editing unit 102, the spaceline editing unit 103, the tag editing unit 104, and the tag placement unit 105 perform the animation data editing process according to the editing instruction information input via the animation editing management unit 101 (editing process step).
- FIG. 4 is a diagram showing a main operation screen of the animation editing apparatus according to the first embodiment, which is a UI (User Interface) provided by the animation editing management means 101.
- a main operation screen 601 that is an animation editing window is a user interface for operating all functions in the animation editing apparatus according to the first embodiment.
- The main operation screen 601 is displayed on a display screen (not shown). On this main operation screen 601, editing instructions are received in step ST1, and the processing results of the animation editing apparatus are presented.
- The menu bar 602 displays items for selecting and executing standard file editing functions (for example, creating a new editing file, opening, saving, copying and pasting data, and configuring the display screen) and editing operations.
- To give an editing instruction, an item in the menu bar 602 is selected with the input device, whereupon the editing process corresponding to that item is executed.
- a timeline 604 displays the timeline state of the animation data being edited.
- a time axis with the right direction as the positive direction is taken as the time line 604, and one scale corresponds to one unit time.
- frames for each unit time are arranged in time series on the time axis.
- a display position cursor 605 indicated by a broken line in the time line 604 or the space line 609 indicates a frame being displayed on the time line 604 or the space line 609.
- a key frame display 606 indicated by a thick line in the time line 604 or the space line 609 indicates a key frame arranged on the time line 604 or the space line 609.
- An interpolation display 607 indicates that the component arrangement and the component display state between the key frames indicated by the key frame display 606 are interpolated and the interpolation method. In the example of FIG. 4, linear interpolation is set as the interpolation content.
- In step ST3, the timeline editing unit 102 edits the timeline of the animation being processed, based on the timeline 604 of the main operation screen 601, in accordance with the timeline editing instruction input via the animation editing management unit 101.
- the timeline editing process will be described in detail with reference to FIG.
- space line display area 608 information related to the space line is displayed.
- a space line 609 displays the state of the space line of the animation data being edited.
- different tags are arranged in two key frames in the space line 609, and an axis corresponding to the distance from the tag is taken in the left-right direction.
- the tag 610 is an identifier representing an arbitrary position on the space line.
- a name is displayed on the tag, and each tag is identified by the name.
- a tag display of the tag name “light” and a tag display of the tag name “dark” are arranged in the key frame on the space line 609 by the tag editing unit 104.
- the frame display area 611 a frame specified by the display position cursor 605 of the time line 604 and the space line 609 is displayed.
- step ST3 the space line editing unit 103 edits the space line in the animation to be edited in accordance with the space line editing instruction input by the animation editing management unit 101 based on the space line 609 on the main operation screen 601.
- the space line editing process will be described in detail with reference to FIG.
- In step ST3, based on the spaceline 609 on the main operation screen 601, the tag editing unit 104 associates the tag desired by the editor, as instructed via the animation editing management unit 101, with the designated position on the spaceline.
- Also in step ST3, the tag placement unit 105 generates and places a tag at the key-frame screen position instructed via the animation editing management unit 101, executing the corresponding editing process.
- On the timeline, the change over time in a part's position and display state between key frames can be specified through the interpolation contents.
- On the spaceline, the change in a part's position and display state depending on the distance between a tag and the animation-data part, on the screen of the frame in which that animation data is arranged as a part, can likewise be specified through the interpolation contents.
- Because the timeline and the spaceline determine the displayed frame from different factors, the animation data to be edited may have both a timeline and a spaceline, or only one of them. However, the same parameter of the same part cannot be changed from both the timeline and the spaceline.
- For example, it is not possible to change a part's on-screen X coordinate on the timeline (moving it in the X-axis direction on the screen) while also changing the X coordinate of the same part on the spaceline: a parameter determined by the timeline (here, the screen X coordinate) cannot also be determined by the spaceline.
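The constraint that the same parameter of the same part must not be driven by both lines could be checked roughly as follows (a hypothetical sketch; the patent does not describe a concrete validation algorithm, and the data representation is an assumption):

```python
def animated_params(line):
    """Collect (part, attribute) pairs whose values differ between
    consecutive key frames on a line, i.e. that the line animates.
    `line` is a list of key frames, each {part name: {attr: value}}."""
    params = set()
    for kf1, kf2 in zip(line, line[1:]):
        for part, attrs in kf1.items():
            for name, value in attrs.items():
                if part in kf2 and kf2[part].get(name) != value:
                    params.add((part, name))
    return params

def conflicts(timeline, spaceline):
    """Return parameters driven by both lines; must be empty."""
    return animated_params(timeline) & animated_params(spaceline)

# the timeline moves the cursor's x; the spaceline only changes its color,
# so the two lines do not conflict
timeline = [{"cursor": {"x": 0}}, {"cursor": {"x": 100}}]
spaceline = [{"cursor": {"color": "white"}}, {"cursor": {"color": "gray"}}]
print(conflicts(timeline, spaceline))   # set() -> no conflict
```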
- In step ST4, the animation editing management means 101 updates the display on the screen, the sound output to the speaker, and the like based on the animation data changed in step ST3 (editing result output step). Thereafter, the process returns to step ST1, again waits for input of an editing instruction, and the processing from step ST2 is repeated.
- FIG. 5 is a flowchart showing a process flow in the editing process step in FIG. 3, and the outline of the process will be described with reference to this figure.
- the animation editing management means 101 displays the main operation screen 601 shown in FIG. 4 and enters an editing instruction waiting state.
- the animation editing management unit 101 determines an editing process corresponding to the item for which the editing instruction has been given. (Step ST1a).
- The method of storing animation data in the storage medium 106 in the present invention differs from the usual method on a personal computer (hereinafter abbreviated as PC) of opening a file, editing it, and then saving it: the data is saved to the storage medium 106 without requiring an explicit save instruction. This is because each editing means of the animation editing apparatus according to the present invention edits not only the animation data loaded into memory but also the data on the storage medium 106 directly. With this method, the data structure handled by each editing means matches the data structure on the storage medium 106.
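The write-through storage model described above, in which every edit is persisted immediately without an explicit save instruction, can be sketched as follows (the file format, class name, and method names are assumptions for illustration):

```python
import json
import os
import tempfile

class WriteThroughStore:
    """Apply each edit to the in-memory data AND to the storage
    medium at once, so the on-disk and in-memory structures match
    and no separate 'save' step is needed."""

    def __init__(self, path):
        self.path = path
        self.data = {"timeline": [], "spaceline": []}
        self._flush()

    def edit(self, line, keyframe):
        """Apply an edit and persist it immediately."""
        self.data[line].append(keyframe)
        self._flush()          # no explicit save instruction required

    def _flush(self):
        with open(self.path, "w") as f:
            json.dump(self.data, f)

path = os.path.join(tempfile.gettempdir(), "anim107.json")
store = WriteThroughStore(path)
store.edit("timeline", {"index": 0})
with open(path) as f:
    print(json.load(f)["timeline"])   # the edit is already on the medium
```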
- step ST1a when “new creation” (not shown in FIG. 4) in the menu bar 602 of the main operation screen 601 is selected, the animation editing management means 101 starts a new animation creation process (step ST2a).
- At this time, the animation editing management means 101 discards the animation data being edited in preparation for creating a new animation, creates an empty timeline and key frame, and secures an area for storing the animation data on the storage medium 106.
- The animation editing management unit 101 sets the existing animation data designated by the "open" operation as the editing target of each editing means (step ST3a). At this time, in preparation for editing the existing animation data, the animation data currently being edited is discarded, and the animation data 107 designated by the "open" operation is read from the storage medium 106 and converted into an editable data format.
- the animation editing management unit 101 ends editing of the animation data designated by the “close” operation (step ST4a).
- When the timeline 604 of the main operation screen 601 is operated, the animation editing management unit 101 outputs the operation content to the timeline editing unit 102. The timeline editing unit 102 reads the timeline data 108 of the animation data 107 for which editing was instructed from the storage medium 106, and applies the operation based on the input operation content to the timeline data 108 (step ST5a).
- When the spaceline 609 on the main operation screen 601 is operated, the animation editing management unit 101 outputs the operation content to the spaceline editing unit 103.
- the space line editing means 103 reads the space line data 109 of the animation data 107 for which an editing instruction has been given from the storage medium 106, and performs an operation based on the input operation content on the space line data 109 (step ST6a). If the instruction is to define a tag on the space line 609, the space line editing unit 103 activates the tag editing unit 104 and puts the tag with the specified name at the specified position on the space line 609. Set.
- For a tag placement operation, the animation editing management unit 101 outputs the operation content to the tag placement unit 105.
- the tag placement unit 105 generates and places a tag with the designated name at the designated position on the key frame screen (step ST7a).
- the animation editing management unit 101 ends the editing process step and proceeds to an editing result output step shown in FIG.
- FIG. 6 is a flowchart showing the flow of the timeline editing process of the animation editing apparatus according to the first embodiment, and shows details of the process in step ST5a of FIG. Details of the processing will be described with reference to this figure.
- the timeline editing unit 102 determines the editing process according to the editing content input from the animation editing management unit 101 (step ST1b), and branches to the corresponding editing process.
- When the editing content input from the animation editing management unit 101 is creation of a new key frame, the timeline editing unit 102 generates a new key frame and initializes it to a usable state (blank, with no parts arranged) (step ST2b).
- If the editing content is arrangement of a part in a key frame, the timeline editing unit 102 generates the specified part at the indicated position in the key frame, in the designated display state, according to the content of the editing instruction (step ST3b).
- user interface parts such as buttons, labels, and check boxes can be designated as the parts. You can also specify animation data that has already been created.
- the timeline editing unit 102 sets various attributes corresponding to the display state of the part for the instructed part according to the content of the editing instruction (step ST4b).
- As attributes, the position and size within the key frame, rotation, color, font, image file name, a transformation matrix for deformation, and the like can be set.
- the settable attributes differ depending on the type of component (the type of object that can be specified as the component described above).
- the timeline editing means 102 places the key frame at the designated position on the timeline according to the content of the editing instruction (step ST5b).
- the timeline editing unit 102 sets the interpolation content between frames for the attribute of a component whose position and display state change between key frames according to the content of the editing instruction. (Step ST6b).
- the interpolation contents for example, interpolation contents such as linear interpolation and spline interpolation are designated.
- FIG. 7 is a diagram showing a specific example of the timeline editing process.
- FIG. 7A shows a timeline
- FIG. 7B shows each key frame in the timeline in FIG. 7A.
- On the timeline 701 shown in FIG. 7A, four key frames 702 to 705 are arranged, and linear interpolation is set between the key frames.
- The cursor parts in the screens of the key frames 702 to 705 are arranged one button lower, one by one, in time order on the timeline, yielding an animation in which the cursor part moves from the top button to the bottom button.
- a display screen 706 displays a key frame 702, and a cursor component 710 is displayed at the “Home” button position on the screen.
- the key frame 703 corresponds to the display screen 707
- the key frame 704 corresponds to the display screen 708, and the key frame 705 corresponds to the display screen 709.
- FIGS. 8A and 8B are diagrams showing another example of the timeline editing process. FIG. 8A shows a timeline, and FIG. 8B shows a frame between key frames in the timeline of FIG. 8A.
- When the display position cursor of the timeline 701 is moved to an intermediate point 801 between the key frame 702 and the key frame 703, a screen (the frame at the intermediate point 801) can be displayed in which the cursor component 710 arranged in the key frames has moved to a position midway between the "Home" button and the "Search from map" button.
- In this way, simply by moving the display position cursor to a desired position on the timeline, the display state (position, etc.) in each frame of a predetermined component (cursor component 710) arranged in the key frames can be displayed easily.
- FIG. 9 is a flowchart showing the flow of the space line editing process of the animation editing apparatus according to the first embodiment, and shows details of the process in step ST6a of FIG.
- the space line editing unit 103 determines the editing process according to the editing content input from the animation editing management unit 101 (step ST1c), and branches to the corresponding editing process.
- the key frame editing process in step ST2c, step ST3c, step ST4c, and step ST6c is the same as the timeline editing process described above, and thus the description thereof is omitted.
- the space line editing unit 103 arranges the key frame at the instructed position on the space line according to the editing content (step ST5c).
- the space line editing unit 103 activates the tag editing unit 104.
- the tag editing unit 104 generates and places a tag at the instructed position on the space line according to the editing content (step ST7c).
- the tag is given a name as an identifier.
- FIG. 10 is a diagram showing a specific example of the spaceline editing process; FIG. 10A shows the spaceline.
- On the spaceline 901 shown in FIG. 10A, two key frames 902 and 903 are arranged, and linear interpolation is set between the key frames.
- a tag 904 with a tag name “light” is arranged in the key frame 902
- a tag 905 with a tag name “dark” is arranged in the key frame 903.
- FIG. 10A the case where a tag is set in a key frame is shown, but a tag may be set in a position other than the key frame (for example, an intermediate point 908).
- the three cursor components shown in FIG. 10B are each arranged in a frame of the same size as the cursor; the focus position within the frame is not changed, and only the focus color is edited.
- the linear interpolation is performed so that the focus color of the cursor component in each frame between the key frames 902 and 903 changes with a constant color change.
- the cursor component 906 shows the display of key frame 902; since its focus color attribute is set to white, the focus is displayed in white. Likewise, the cursor component 907 shows the display of key frame 903; since its focus color attribute is set to gray, the focus is displayed in gray.
- by moving the display position cursor to the frame at the intermediate point 908 between the key frames 902 and 903 on the space line 901, a screen displaying the cursor component 909 shown in FIG. 10B can be displayed.
- the cursor component 909 is displayed in light gray, which is an intermediate color between the cursor component 906 and the cursor component 907.
- the display state of a given component (cursor component 909) arranged in the key frames can be displayed easily, simply by moving the display position cursor to a desired position on the space line.
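The linear color interpolation described above can be sketched as follows. The RGB values and the helper name are illustrative assumptions, not taken from the patent; they only show how a focus color could be blended between the two key frames.

```python
def lerp_color(c0, c1, t):
    """Linearly interpolate two RGB colors; t in [0.0, 1.0]."""
    return tuple(round(a + (b - a) * t) for a, b in zip(c0, c1))

WHITE = (255, 255, 255)  # assumed focus color at key frame 902 (tag "light")
GRAY = (128, 128, 128)   # assumed focus color at key frame 903 (tag "dark")

# The intermediate point 908 lies halfway between the two key frames,
# so the focus is drawn in an intermediate light gray.
mid = lerp_color(WHITE, GRAY, 0.5)
```

At t = 0.0 and t = 1.0 the function returns the key-frame colors unchanged, matching the constant-rate color change described for the frames between key frames 902 and 903.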
- FIG. 11 is a diagram showing a specific example of the tag placement process of the animation editing apparatus according to the first embodiment, and shows a screen obtained by performing a tag placement editing operation on the main operation screen of FIG. 4.
- the animation editing management means 101 activates the tag placement means 105 and switches to the screen shown in FIG. 11.
- the selected part (cursor part) becomes the selection display 1003.
- tags 1002 defined in the space line of the selected part are displayed in a list.
- a tag to be placed on the display screen is selected from the tags 1002 listed in the tag display area 1001 and dropped at a position within the frame in the frame display area 611; the selected tag is placed at that position. Any number of tags can be placed in a frame.
- in FIG. 11, a case has been shown in which a component placed on the screen is designated and a tag placement editing instruction is executed. Alternatively, a defined (created) animation component may be designated by a file name or the like and the tag placement editing instruction executed.
- in FIG. 11, the tag is arranged on a key frame like other components, but a tag may be arranged over an arbitrary frame range independent of the key frames.
- FIG. 12 shows an example of the data structure of animation data.
- the animation data 107 has a plurality of attributes that represent an outline of the data.
- the animation name is a name given to the animation data.
- the length is data representing the total length of the animation in seconds.
- the frame rate represents the number of frames displayed per second.
- FIG. 13 is a diagram showing an example of the data structure of timeline data.
- the timeline data 108 holds, as data, the times at which key frames are set on the timeline.
- each key frame is identified by a key frame ID; the key frame whose key frame ID is frame0 is assigned to 0.0 seconds after the start (the head) of the animation.
- a key frame with a key frame ID of frame 1 is assigned to 3.0 seconds
- a key frame with a key frame ID of frame 2 is assigned to 5.2 seconds.
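The structures of FIGS. 12 and 13 might be modeled as follows. The field names, types, and sample values are assumptions based on the description (animation name, total length in seconds, frame rate, and a key-frame-ID-to-time map), not definitions taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class AnimationData:
    name: str                # animation name (FIG. 12)
    length_sec: float        # total length of the animation in seconds
    frame_rate: int          # frames displayed per second
    # Timeline data (FIG. 13): key frame ID -> time in seconds from the head
    timeline: dict = field(default_factory=dict)

anim = AnimationData(name="menu", length_sec=5.2, frame_rate=30)
anim.timeline = {"frame0": 0.0, "frame1": 3.0, "frame2": 5.2}
```

With this layout, the example of FIG. 13 (frame0 at 0.0 s, frame1 at 3.0 s, frame2 at 5.2 s) is a single dictionary keyed by key frame ID.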
- FIG. 14 is a diagram showing an example of the data structure of key frame data.
- the key frame data 110 stores data of a plurality of key frames.
- a key frame ID is assigned to the data of each key frame.
- the key frame data includes a list of components or tags (specified by the tag arrangement data 111) arranged in the key frame.
- FIG. 14B shows the data of each key frame.
- a part name is given to each part, and the part name becomes an identifier of the part in the key frame.
- a component type is designated for each component.
- the part type is an image, text, animation data, tag, or the like.
- each component has data corresponding to its type.
- the common data includes coordinates X and Y in the part frame.
- a component having a size has a width W and a height H as data.
- for components such as images, the binary data itself is also set.
- the tag has the tag name as data.
- FIG. 15 is a diagram showing an example of the data structure of the interpolation setting data.
- the interpolation setting data 112 specifies the interpolation method used in the frames following each key frame.
- the key frame ID designates the key frame for which an interpolation method is set.
- the part name specifies the component within that key frame for which the interpolation method is set.
- the property specifies the property to be interpolated in the component.
- as properties, the coordinates X and Y, the width W and height H as sizes, and properties specific to each component can be specified.
- the interpolation method specifies how the property value is interpolated.
- FIG. 16 is a diagram showing an example of the data structure of the space line data.
- the space line data 109 has, as data, a position where a key frame is set in the space line. This position is normalized with the head position being 0.0 and the tail position being 1.0.
- in the example, the key frame with key frame ID frame0 is assigned to the head position 0.0, the key frame with key frame ID frame1 to the position 0.4, and the key frame with key frame ID frame2 to the tail position 1.0.
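A sketch of how a normalized space line position could be resolved to the pair of bracketing key frames used for interpolation, assuming the positions of FIG. 16. The function and variable names are illustrative, not from the patent.

```python
def spaceline_segment(position, keyframes):
    """Given a normalized space line position (0.0-1.0) and a list of
    (key_frame_id, position) pairs sorted by position, return the two
    bracketing key frames and the local interpolation parameter t."""
    for (id0, p0), (id1, p1) in zip(keyframes, keyframes[1:]):
        if p0 <= position <= p1:
            t = (position - p0) / (p1 - p0)
            return id0, id1, t
    raise ValueError("position lies outside the space line")

# Space line data of FIG. 16: frame0 at 0.0, frame1 at 0.4, frame2 at 1.0
kf = [("frame0", 0.0), ("frame1", 0.4), ("frame2", 1.0)]
```

For example, position 0.2 falls halfway between frame0 and frame1, so the frame displayed there is interpolated with t = 0.5 between those two key frames.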
- FIG. 17 is a diagram illustrating an example of a data structure of tag data.
- the tag data 113 includes the tag position and name as data. This position is a position where the tag is defined in the space line.
- the name is a name that is an identifier attached to the tag.
- as described above, according to the animation editing apparatus of the first embodiment, the storage medium 106 stores animation data 107 including the timeline data 108 that defines the layout of the key frames, the arrangement of tags in the key frames, and the interpolation contents between the key frames, and the space line data 109 that defines the same along the space line and has tag data 113 defining the tags arranged on the space line. The apparatus presents the timeline, the space line, and the frame contents based on them, accepts editing instructions, and performs timeline editing, space line editing, tag editing, and tag placement in accordance with the editing instructions input in response to the presented contents.
- the position of an animation component on the display screen and its display state at that position can be easily associated and edited, so that animation editing efficiency can be improved.
- FIG. 18 is a block diagram showing the configuration of an animation reproduction apparatus according to Embodiment 2 of the present invention.
- the animation playback apparatus according to the second embodiment has, in addition to the storage medium 106 of FIG. 1 and the animation data 107 stored therein, an animation execution management unit 1101, a timeline execution unit 1102, a space line execution unit 1103, and a tag/part distance calculation unit (distance calculation unit) 1104.
- the animation execution management means 1101 reads the animation data 107 from the storage medium 106, executes the animation data, displays the resulting animation on the screen, and presents it to the user.
- an output device (not shown) such as a display or a speaker is used to output the execution result.
- the timeline executing unit 1102 generates key frames, generates parts on the key frames, and interpolates between the key frames based on the information on the time line, and generates an animation image that is sequentially displayed at the time of execution.
- the space line execution means 1103 generates a key frame, generates parts on the key frame, and interpolates between the key frames based on the space line information, and generates an animation image that is sequentially displayed at the time of execution.
- the tag/part distance calculation unit 1104 calculates distance information between the tag and the part based on the positional relationship between the tag and the part arranged in each key frame, and determines the display position (display frame) on the space line from the distance information.
- the space line execution unit 1103 generates an animation image in which components are arranged on the screen based on the display position on the space line determined by the tag / component distance calculation unit 1104.
- the animation playback apparatus is constructed on a computer as shown in FIG. 2 in the first embodiment. That is, the animation execution management unit 1101, the timeline execution unit 1102, the spaceline execution unit 1103, and the tag / part distance calculation unit 1104 read the animation reproduction program according to the gist of the present invention into the computer and execute it on the CPU 202. By doing so, it can be realized on the computer as a specific means in which hardware and software cooperate.
- the output device controller 205 displays the instruction selection screen and the reproduction contents and outputs audio via the display 206 and the speaker 208; in response, instructions to the animation execution management unit 1101, the timeline execution unit 1102, the space line execution unit 1103, and the tag/part distance calculation unit 1104 are issued via the input device controller 201 using an input device such as a remote controller (not shown in FIG. 2).
- the storage medium 106 can be constructed on the storage area of the hard disk 203 and the memory 204 built in the computer as a standard or on the storage medium of an external storage device.
- FIG. 19 is a diagram showing an example of a reference point for defining the position on the screen.
- FIG. 19A shows the reference point of the cursor part
- FIG. 19B shows the reference point of the tag.
- the cursor part 1201 has a reference point at the center 1202 (the center of the cross) of the part.
- the position of this reference point on the screen is regarded as the position of the cursor part.
- the tag 1203 also has a reference point in the central portion 1204 of the tag.
- the initial position of the reference point may be the center or the center of gravity of the component, and the editor may move it to any position as necessary.
- FIG. 20 is a diagram showing an outline of processing for determining the display position on the space line from the positional relationship between the component and the tag, and FIG. 20A shows the positional relationship between the cursor component and the tag.
- FIG. 20B shows how the display position (frame to be displayed) of the space line is determined based on the positional relationship of FIG.
- the cursor component 1301 and the tags 1302 and 1304 shown in FIG. 20A have reference points at the same positions as the cursor component and tag shown in FIGS. 19A and 19B.
- the tag/part distance calculation means 1104 calculates, from the on-screen coordinates of the cursor component 1301, the tag 1302 with the tag name “light”, and the tag 1304 with the tag name “dark”, the distance 1303 (hereinafter, distance L) between the reference point of the cursor component 1301 and the reference point of the tag 1302, and the distance 1305 (hereinafter, distance D) between the reference point of the cursor component 1301 and the reference point of the tag 1304.
- the tag/part distance calculation means 1104 then assumes a point P 1309 between the tag 1307 and the tag 1308 on the space line 1306 in which the tags 1302 and 1304 are defined, and calculates the position of the point P 1309 such that the ratio of the distance 1310 between the tag 1307 and the point P 1309 to the distance 1311 between the tag 1308 and the point P 1309 matches the ratio of the distance L to the distance D calculated above.
- the position of the point P1309 obtained by this calculation is set as the display position on the space line.
- the display state (frame) at the position of the point P1309 is used for displaying the cursor part 1301.
- here, the case where the reference points of the tags 1302 and 1304 lie on a straight line through the reference point of the cursor component 1301 is taken as an example; however, even when they do not lie on a straight line, the mutual distances can be calculated from the positional relationship between the cursor component 1301 and each of the tags 1302 and 1304, and the display position on the space line can be determined from these distances.
- similarly, the moving direction of the cursor component 1301 need not be parallel to the straight line passing through the tag 1302 and the tag 1304; the display position on the space line can still be determined based on the distance ratio between the cursor component and each tag.
- the display position of the cursor part on the space line is calculated from the distance from the tag using the tag closest to the reference point of the cursor part, for example.
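The calculation of point P can be sketched as follows, under the assumption that the display position is the point dividing the tag interval on the space line in the ratio L : D. The function and parameter names are illustrative.

```python
import math

def display_position(part, tag_light, tag_dark, pos_light, pos_dark):
    """Determine the display position on the space line from the distances
    between a part's reference point and two tags.
    part, tag_light, tag_dark: (x, y) screen coordinates of reference points.
    pos_light, pos_dark: normalized space-line positions of the two tags."""
    L = math.dist(part, tag_light)  # distance 1303 (to tag "light")
    D = math.dist(part, tag_dark)   # distance 1305 (to tag "dark")
    if L + D == 0:
        return pos_light
    # Point P divides [tag_light, tag_dark] so that
    # |tag_light -> P| : |P -> tag_dark| = L : D.
    return pos_light + (pos_dark - pos_light) * (L / (L + D))
```

Because only distances enter the formula, it also works when the part and the two tags are not collinear, as noted in the text.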
- as described above, according to the animation playback apparatus of the second embodiment, the storage medium 106 stores animation data 107 including the timeline data 108 that defines the layout of the key frames, the arrangement of tags in the key frames, and the interpolation contents between the key frames, and the space line data 109 that defines the same along the space line and has tag data 113 defining the tags arranged on the space line. The apparatus comprises the timeline execution means 1102 for generating an animation image from the animation data 107 based on the frame relationship defined by the timeline data 108, the tag/part distance calculation means 1104 for calculating the distance between a tag arranged in a frame and an animation component and determining the display position on the space line, the space line execution means 1103 for generating an animation image of the component corresponding to that display position, and the animation execution management means 1101 that inputs the animation data 107 to be played back to the timeline execution means 1102, the tag/part distance calculation means 1104, and the space line execution means 1103 and causes them to generate the animation images. With this configuration, it is possible to execute and display an animation in which the position of a component on the display screen is associated with the display state of the component.
- FIG. 21 is a block diagram showing the configuration of an animation editing apparatus according to Embodiment 3 of the present invention. As shown in FIG. 21, the animation editing apparatus according to the third embodiment includes a space plane editing means 1401 and a tag editing means 1402 in place of the space line editing means 103 and the tag editing means 104 in the configuration of FIG. 1 shown in the first embodiment.
- in accordance with the editing instruction received via the animation editing management unit 101, the space plane editing unit 1401 creates key frames, arranges components on the key frames, arranges the key frames on the space plane, and sets the interpolation of the positions and display contents of components between the key frames.
- the tag editing unit 1402 defines a tag at the designated position on the space plane.
- the space plane is a two-dimensional plane onto which the relative positional relationship between the display position of an animation component placed in a key frame and the reference position indicated by a tag is mapped. Space plane data 109a defining the space plane is stored in the storage medium 106.
- this animation editing apparatus is constructed on a computer as shown in FIG. 2 in the first embodiment. That is, the animation editing management unit 101, the timeline editing unit 102, the space plane editing unit 1401, the tag editing unit 1402, and the tag placement unit 105 can be realized on the computer as specific means in which hardware and software cooperate, by causing the CPU 202 to read and execute an animation editing program according to the gist of the present invention.
- the output device controller 205 displays the instruction selection screen, editing results, and the like, and outputs audio via the display 206 and the speaker 208; in response, editing instructions to the animation editing management unit 101, the timeline editing unit 102, the space plane editing unit 1401, the tag editing unit 1402, and the tag placement unit 105 are issued via the input device controller 201 using input devices such as the mouse 200 and the keyboard 207.
- the storage medium 106 can be constructed on the storage area of the hard disk 203 and the memory 204 built in the computer as a standard or on the storage medium of an external storage device.
- FIG. 22 is a diagram showing a main operation screen of the animation editing apparatus according to the third embodiment.
- a space plane display area 1501 displays information related to the space plane 1502.
- the space plane 1502 displays the state of the space plane of the animation data being edited.
- a key frame is arranged at the position of the key frame display 606 on the space plane 1502.
- a triangular area connecting the key frame displays 606 on the space plane 1502 with a straight line is an interpolable area 1503 between key frames.
- the display colors of red, blue, and green are set for the components of each key frame.
- the space plane editing unit 1401 linearly interpolates the display colors of the parts of each key frame in the interpolable area 1503. For this reason, the frame 605a that is designated by the display position cursor 605 and is located at the center of the interpolable area 1503 has a display color of gray in which red, blue, and green are linearly added.
- the tag editing unit 1402 adds a tag 610 with a tag name “red”, a tag 610 with a tag name “green”, and a tag 610 with a tag name “blue” to each key frame of the space plane 1502. Each is arranged. Note that the tag can also be arranged in the interpolable area 1503.
- FIG. 23 is a diagram showing a tag arrangement screen of the animation editing apparatus according to the third embodiment, and shows a case where tags are arranged on the frame display area 611 of the main operation screen 601.
- a tag is two-dimensionally arranged on the frame display area 611, and distance information between each tag and a component is calculated. From this distance information, a display position on the space plane 1502 in FIG. 22 is determined.
- in the example of FIG. 23, the distance between the cursor component at the position of the “Home” button in the selection display 1003 and the tag display with the tag name “red” arranged at a key frame, the distance between the cursor component and the tag display with the tag name “blue” arranged at a key frame, and the distance between the cursor component and the tag display with the tag name “green” arranged at a key frame are used, and the display position of the cursor component is determined at the position with the same distance ratios on the space plane 1502 of FIG. 22.
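One plausible realization of this same-distance-ratio mapping onto the space plane is inverse-distance weighting over the three tags. The patent does not fix the exact formula, so the scheme and all names below are assumptions for illustration only.

```python
import math

def plane_position(part, tags):
    """Map a part's screen position into the space plane using the
    distances to the tags. `tags` is a list of
    (screen_position, plane_position) pairs, one per tag.
    Uses inverse-distance weighting (an assumed scheme)."""
    weights = []
    for screen_pos, plane_pos in tags:
        d = math.dist(part, screen_pos)
        if d == 0:
            return plane_pos  # the part sits exactly on a tag
        weights.append((1.0 / d, plane_pos))
    total = sum(w for w, _ in weights)
    x = sum(w * p[0] for w, p in weights) / total
    y = sum(w * p[1] for w, p in weights) / total
    return (x, y)
```

A part lying exactly on a tag maps to that tag's plane position, and a part equidistant from two tags maps halfway between their plane positions along that axis, which matches the linear blending behavior described for the interpolable area 1503.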
- as described above, according to the animation editing apparatus of the third embodiment, the storage medium 106 stores animation data 107 including the timeline data 108 that defines the layout of the key frames, the arrangement of tags in the key frames, and the interpolation contents between the key frames, and the space plane data 109a that defines the same on the space plane and has tag data 113 defining the tags arranged on the space plane. The apparatus presents the timeline and the space plane defined by the timeline data 108 and the space plane data 109a included in the animation data 107 to be edited read from the storage medium 106, together with the frame contents based on them, accepts editing instructions, and performs timeline editing, space plane editing, tag editing, and tag placement in accordance with the editing instructions input in response to the presented contents.
- an animation playback device that plays back an animation edited by the animation editing apparatus according to the third embodiment comprises, in addition to the storage medium 106 of FIG. 21 and the animation data 107 stored therein shown in the third embodiment, animation execution management means, timeline execution means, space plane execution means, and tag/part distance calculation means (distance calculation means).
- the animation execution management means reads the animation data 107 from the storage medium 106, executes the animation data, displays the resulting animation on the screen, and presents it to the user.
- an output device (not shown) such as a display or a speaker is used to output the execution result.
- the timeline executing means generates a key frame, generates parts on the key frame, and interpolates between the key frames based on the information on the time line, and generates an animation image that is sequentially displayed at the time of execution.
- the space plane executing means generates a key frame, generates parts on the key frame, and interpolates between the key frames based on the information of the space plane, and generates an animation image that is sequentially displayed at the time of execution.
- the tag/part distance calculation means calculates distance information between the tag and the part based on the positional relationship between the tag and the part arranged in each key frame, and determines the display position (display frame) on the space plane from the distance information.
- the space plane execution means generates an animation image in which parts are arranged on the screen based on the display position on the space plane determined by the tag/part distance calculation means.
- this animation playback device is constructed on a computer as shown in FIG. 2 in the first embodiment. That is, the animation execution management means, the timeline execution means, the space plane execution means, and the tag / part distance calculation means cause the CPU 202 to read the animation reproduction program according to the gist of the present invention and execute it by the CPU 202. It can be realized on the computer as a specific means in which hardware and software cooperate.
- the animation editing apparatus can easily edit the part position on the screen and the display state of the part at that position, and can easily understand the correspondence between the position and the display state from the editing result. Therefore, it is suitable for creating user interfaces and movie contents using animation.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
- Television Signal Processing For Recording (AREA)
Abstract
Description
Embodiment 1.
FIG. 1 is a block diagram showing the configuration of an animation editing apparatus according to Embodiment 1 of the present invention. In FIG. 1, the animation editing management means 101 reads and writes animation data from and to the storage medium 106, receives editing instructions from the editor (user), and presents editing results to the editor. An input device (not shown) such as a keyboard or mouse is used to input editing instructions, and an output device (not shown) such as a display or speaker is used to output editing results.
(A) Select a position on the timeline to be made a key frame (an empty frame or an interpolated frame) and newly create a key frame there.
(B) Newly create a key frame not associated with the timeline and insert (associate) it at a desired position on the timeline.
A tag placed in a key frame is carried over into the interpolated frames. Thus, when an interpolated frame is displayed, it can be confirmed that the tag is placed at the same position as in the immediately preceding key frame.
(1) Outline of operation
FIG. 3 is a flowchart showing the operation of the animation editing apparatus according to the first embodiment; the animation editing operation is described along this figure.
First, when the animation editing apparatus starts, the animation editing management means 101 enters an input waiting state for accepting external editing instructions. When the editor (user) inputs an editing instruction using an input device such as a keyboard or mouse, the animation editing management means 101 receives the input editing instruction information (step ST1; editing instruction input step).
However, the same parameter of the same component cannot be varied from both the timeline and the space line. For example, varying the X coordinate of a component on the screen along the timeline (moving it in the X-axis direction of the screen) while also varying the X coordinate of the same component along the space line is not possible, because it cannot be decided whether the parameter determined by the timeline (for example, the X coordinate on the screen) or the parameter determined by the space line should be used as the X coordinate.
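The constraint that the same parameter of the same component cannot be driven by both the timeline and the space line could be enforced with a simple set-intersection check. This is a sketch; the function and data representation are assumptions.

```python
def check_conflicts(timeline_props, spaceline_props):
    """Reject edits in which the same property of the same component is
    driven by both the timeline and the space line. Each argument is a
    set of (part_name, property) pairs currently animated on that axis."""
    conflicts = timeline_props & spaceline_props
    if conflicts:
        raise ValueError(f"property driven by both axes: {sorted(conflicts)}")

# Animating X on the timeline and Y on the space line is allowed:
check_conflicts({("cursor", "X")}, {("cursor", "Y")})
```

Attempting to register ("cursor", "X") on both axes would raise an error, mirroring the restriction stated above.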
FIG. 5 is a flowchart showing the flow of processing in the editing processing step in FIG. 3; the outline of the processing is described along this figure.
In the editing processing step, the animation editing management means 101 displays the main operation screen 601 shown in FIG. 4 and waits for an editing instruction. When an editing instruction is given for any item on the main operation screen 601 using an input device (not shown), the animation editing management means 101 determines the editing process corresponding to the item for which the editing instruction was given (step ST1a).
FIG. 6 is a flowchart showing the flow of the timeline editing process of the animation editing apparatus according to the first embodiment and shows the details of the process in step ST5a of FIG. 5. The details of the process are described along this figure. The timeline editing means 102 determines the editing process according to the editing content input from the animation editing management means 101 (step ST1b) and branches to the corresponding editing process.
FIG. 9 is a flowchart showing the flow of the space line editing process of the animation editing apparatus according to the first embodiment and shows the details of the process in step ST6a of FIG. 5. The space line editing means 103 determines the editing process according to the editing content input from the animation editing management means 101 (step ST1c) and branches to the corresponding editing process. The key frame editing processes in steps ST2c, ST3c, ST4c, and ST6c are the same as the timeline editing process described above, and their description is omitted.
FIG. 11 is a diagram showing a specific example of the tag placement process of the animation editing apparatus according to the first embodiment, and shows a screen obtained by performing a tag placement editing operation on the main operation screen of FIG. 4.
When a component whose display state is defined by the space line (in the example of FIG. 11, the cursor component) is selected on the main operation screen 601 and a tag placement editing instruction operation is performed, the animation editing management means 101 activates the tag placement means 105 and the screen switches to that shown in FIG. 11. The selected component (cursor component) becomes the selection display 1003.
FIG. 18 is a block diagram showing the configuration of an animation reproduction apparatus according to Embodiment 2 of the present invention. As shown in FIG. 18, the animation reproduction apparatus according to the second embodiment includes, in addition to the storage medium 106 of FIG. 1 and the animation data 107 stored therein shown in the first embodiment, animation execution management means 1101, timeline execution means 1102, space line execution means 1103, and tag/part distance calculation means (distance calculation means) 1104.
Here, the process of determining the display position on the space line from the positional relationship between the tags placed on the screen and the component for which the tags are defined is described in detail.
FIG. 19 is a diagram showing an example of reference points for defining positions on the screen; FIG. 19(a) shows the reference point of the cursor component, and FIG. 19(b) shows the reference point of the tag. As shown in FIG. 19(a), the cursor component 1201 has a reference point at its central portion 1202 (the center of the cross). The position of this reference point on the screen is regarded as the position of the cursor component. As shown in FIG. 19(b), the tag 1203 likewise has a reference point at its central portion 1204. Although the central portion of a component or tag is used as the reference point here, the initial position of the reference point may be the center or the center of gravity of the component, and the editor may move it to an arbitrary position as necessary.
FIG. 21 is a block diagram showing the configuration of an animation editing apparatus according to Embodiment 3 of the present invention. As shown in FIG. 21, the animation editing apparatus according to the third embodiment includes space plane editing means 1401 and tag editing means 1402 in place of the space line editing means 103 and the tag editing means 104 in the configuration of FIG. 1 shown in the first embodiment.
FIG. 22 is a diagram showing the main operation screen of the animation editing apparatus according to the third embodiment. In FIG. 22, a space plane display area 1501 displays information related to the space plane 1502. The space plane 1502 displays the state of the space plane of the animation data being edited.
Claims (4)
- In an animation editing device for editing an animation based on animation data consisting of the data of the frames constituting the animation,
data storage means for storing the animation data including: timeline data in which, based on a timeline indicating the temporal display order of frames, the layout of key frames serving as references for the animation, the arrangement of tags in the key frames, and the interpolation contents between the key frames are defined; and space line data in which, based on a space line obtained by mapping onto a one-dimensional straight line the relative positional relationship between the display position of an animation component and a reference position indicated by a tag, the layout of the key frames, the arrangement of tags in the key frames, and the interpolation contents between the key frames are defined, the space line data having tag data defining the tags arranged on the space line;
timeline editing means for creating key frames to be arranged on the timeline, arranging key frames on the timeline, and setting the interpolation contents between the key frames of the timeline, with respect to the timeline data read from the data storage means, in accordance with an input editing instruction;
space line editing means for creating key frames to be arranged on the space line, arranging key frames on the space line, and setting the interpolation contents between the key frames of the space line, with respect to the space line data read from the data storage means, in accordance with an input editing instruction;
tag editing means for setting a tag at the position indicated by an editing instruction on the space line defined by the space line data read from the data storage means, in accordance with the input editing instruction;
tag placement means for setting a tag at the position indicated by an editing instruction in a key frame defined by the animation data, in accordance with the input editing instruction; and
animation editing management means for presenting the timeline and the space line defined by the timeline data and the space line data included in the animation data to be edited read from the data storage means, together with frame contents based on them, accepting editing instructions, causing the timeline editing means, the space line editing means, the tag editing means, and the tag placement means to execute editing processes in accordance with the editing instructions input in response to the presented contents, and presenting the animation editing result. - In an animation editing device for editing an animation based on animation data consisting of the data of the frames constituting the animation,
data storage means for storing the animation data including: timeline data in which, based on a timeline indicating the temporal display order of frames, the layout of key frames serving as references for the animation, the arrangement of tags in the key frames, and the interpolation contents between the key frames are defined; and space plane data in which, based on a space plane obtained by mapping onto a two-dimensional plane the relative positional relationship between the display position of an animation component and a reference position indicated by a tag, the layout of the key frames, the arrangement of tags in the key frames, and the interpolation contents between the key frames are defined, the space plane data having tag data defining the tags arranged on the space plane;
timeline editing means for creating key frames to be arranged on the timeline, arranging key frames on the timeline, and setting the interpolation contents between the key frames of the timeline, with respect to the timeline data read from the data storage means, in accordance with an input editing instruction;
space plane editing means for creating key frames to be arranged on the space plane, arranging key frames on the space plane, and setting the interpolation contents between the key frames of the space plane, with respect to the space plane data read from the data storage means, in accordance with an input editing instruction;
tag editing means for setting a tag at the position indicated by an editing instruction on the space plane, with respect to the space plane data read from the data storage means, in accordance with the input editing instruction;
tag placement means for setting a tag at the position indicated by an editing instruction in a key frame defined by the animation data read from the data storage means, in accordance with the input editing instruction; and
animation editing management means for presenting the timeline and the space plane defined by the timeline data and the space plane data included in the animation data to be edited read from the data storage means, together with frame contents based on them, accepting editing instructions, causing the timeline editing means, the space plane editing means, the tag editing means, and the tag placement means to execute editing processes in accordance with the editing instructions input in response to the presented contents, and presenting the animation editing result. - In an animation reproduction device for reproducing animation data consisting of the data of the frames constituting an animation,
data storage means for storing the animation data including: timeline data in which, based on a timeline indicating the temporal display order of frames, the layout of key frames serving as references for the animation, the arrangement of tags in the key frames, and the interpolation contents between the key frames are defined; and space line data in which, based on a space line obtained by mapping onto a one-dimensional straight line the relative positional relationship between the display position of an animation component and a reference position indicated by a tag, the layout of the key frames, the arrangement of tags in the key frames, and the interpolation contents between the key frames are defined, the space line data having tag data defining the tags arranged on the space line;
timeline execution means for generating, from the animation data read from the data storage means, an animation image based on the frame relationship defined by the timeline data;
distance calculation means for calculating the distance between a tag arranged in a frame and an animation component, and determining the display position of the animation component on the space line corresponding to the positional relationship within the frame based on the distance;
space line execution means for generating, from the animation data read from the data storage means, an animation image of the animation component corresponding to the display position on the space line determined by the distance calculation means, based on the frame relationship defined by the space line data; and
animation execution management means for inputting the animation data to be reproduced read from the data storage means to the timeline execution means, the distance calculation means, and the space line execution means, and causing them to generate animation images of the animation data to be reproduced. - In an animation reproduction device for reproducing animation data consisting of the data of the frames constituting an animation,
data storage means for storing the animation data including: timeline data in which, based on a timeline indicating the temporal display order of frames, the layout of key frames serving as references for the animation, the arrangement of tags in the key frames, and the interpolation contents between the key frames are defined; and space plane data in which, based on a space plane obtained by mapping onto a two-dimensional plane the relative positional relationship between the display position of an animation component and a reference position indicated by a tag, the layout of the key frames, the arrangement of tags in the key frames, and the interpolation contents between the key frames are defined, the space plane data having tag data defining the tags arranged on the space plane;
timeline execution means for generating, from the animation data read from the data storage means, an animation image based on the frame relationship defined by the timeline data;
distance calculation means for calculating the distance between a tag arranged in a frame and an animation component, and determining the display position of the animation component on the space plane corresponding to the positional relationship within the frame based on the distance;
space plane execution means for generating, from the animation data read from the data storage means, an animation image of the animation component corresponding to the display position on the space plane determined by the distance calculation means, based on the frame relationship defined by the space plane data; and
animation execution management means for inputting the animation data to be reproduced read from the data storage means to the timeline execution means, the distance calculation means, and the space plane execution means, and causing them to generate animation images of the animation data to be reproduced.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112009004615T DE112009004615T5 (de) | 2009-03-31 | 2009-03-31 | Animationsbearbeitungsvorrichtung und Animationswiedergabevorrichtung |
JP2011506843A JP5063810B2 (ja) | 2009-03-31 | 2009-03-31 | アニメーション編集装置、アニメーション再生装置及びアニメーション編集方法 |
CN2009801581061A CN102356407B (zh) | 2009-03-31 | 2009-03-31 | 动画编辑装置、动画再现装置以及动画编辑方法 |
PCT/JP2009/001497 WO2010113211A1 (ja) | 2009-03-31 | 2009-03-31 | アニメーション編集装置及びアニメーション再生装置 |
US13/146,660 US8786612B2 (en) | 2009-03-31 | 2009-03-31 | Animation editing device, animation playback device and animation editing method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2009/001497 WO2010113211A1 (ja) | 2009-03-31 | 2009-03-31 | アニメーション編集装置及びアニメーション再生装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010113211A1 true WO2010113211A1 (ja) | 2010-10-07 |
Family
ID=42827545
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/001497 WO2010113211A1 (ja) | 2009-03-31 | 2009-03-31 | アニメーション編集装置及びアニメーション再生装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8786612B2 (ja) |
JP (1) | JP5063810B2 (ja) |
CN (1) | CN102356407B (ja) |
DE (1) | DE112009004615T5 (ja) |
WO (1) | WO2010113211A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103208131A (zh) * | 2012-08-28 | 2013-07-17 | 北京中盈高科信息技术有限公司 | 一种三维动画的制作、播放、修改方法和电子终端 |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9098186B1 (en) | 2012-04-05 | 2015-08-04 | Amazon Technologies, Inc. | Straight line gesture recognition and rendering |
US9373049B1 (en) * | 2012-04-05 | 2016-06-21 | Amazon Technologies, Inc. | Straight line gesture recognition and rendering |
TWI606418B (zh) * | 2012-09-28 | 2017-11-21 | 輝達公司 | 圖形處理單元驅動程式產生內插的圖框之電腦系統及方法 |
CN110428485A (zh) * | 2019-07-31 | 2019-11-08 | 网易(杭州)网络有限公司 | 二维动画编辑方法及装置、电子设备、存储介质 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0696186A (ja) * | 1992-09-10 | 1994-04-08 | Fujitsu Ltd | 時刻/属性値により変化する図形の編集装置 |
JPH0935083A (ja) * | 1995-07-24 | 1997-02-07 | Hitachi Ltd | アニメーション編集装置 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5926186A (en) | 1992-09-10 | 1999-07-20 | Fujitsu Limited | Graphic editing apparatus and method |
US6310622B1 (en) * | 1998-04-07 | 2001-10-30 | Adobe Systems Incorporated | Automatic graphical pattern placement and adjustment |
JP4114720B2 (ja) * | 2002-10-25 | 2008-07-09 | 株式会社ソニー・コンピュータエンタテインメント | Image generation method and image generation device |
JP4772455B2 (ja) | 2005-10-26 | 2011-09-14 | 和久 下平 | Animation editing system |
US8170380B1 (en) * | 2008-05-30 | 2012-05-01 | Adobe Systems Incorporated | Method and apparatus for importing, exporting and determining an initial state for files having multiple layers |
- 2009-03-31 DE DE112009004615T patent/DE112009004615T5/de not_active Withdrawn
- 2009-03-31 US US13/146,660 patent/US8786612B2/en active Active
- 2009-03-31 WO PCT/JP2009/001497 patent/WO2010113211A1/ja active Application Filing
- 2009-03-31 JP JP2011506843A patent/JP5063810B2/ja not_active Expired - Fee Related
- 2009-03-31 CN CN2009801581061A patent/CN102356407B/zh not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
CN102356407B (zh) | 2013-09-25 |
CN102356407A (zh) | 2012-02-15 |
US20110298809A1 (en) | 2011-12-08 |
JP5063810B2 (ja) | 2012-10-31 |
JPWO2010113211A1 (ja) | 2012-10-04 |
US8786612B2 (en) | 2014-07-22 |
DE112009004615T5 (de) | 2012-08-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101037864B1 (ko) | Method and software program for generating a feature for use in a plurality of media objects | |
US8589871B2 (en) | Metadata plug-in application programming interface | |
RU2378698C2 (ru) | Model for determining the key frame of an attribute of associated objects | |
CN101207717B (zh) | System and method for organizing templates for generating moving images | |
KR100989459B1 (ko) | Apparatus and method for providing a video frame sequence, apparatus and method for providing a scene model, scene model, apparatus and method for generating a menu structure, and computer program | |
US6072479A (en) | Multimedia scenario editor calculating estimated size and cost | |
US7761796B2 (en) | Animation on object user interface | |
US8161452B2 (en) | Software cinema | |
JP5372518B2 (ja) | Audio and video control of an interactive electronically simulated environment | |
KR20080100434A (ko) | Content access tree | |
JP5063810B2 (ja) | Animation editing device, animation playback device, and animation editing method | |
US20060181545A1 (en) | Computer based system for selecting digital media frames | |
US20070182740A1 (en) | Information processing method, information processor, recording medium, and program | |
JP4845975B2 (ja) | Apparatus and method for providing a sequence of video frames, apparatus and method for providing a scene model, scene model, apparatus and method for creating a menu structure, and computer program | |
KR101118536B1 (ко) | Method for providing means for authoring interactive content | |
JP2011150568A (ja) | Production preparation activity program | |
JP4967983B2 (ja) | Information recording device and program | |
JP2009069684A (ja) | Image display device, image display method, and image display program | |
TW202044020A (zh) | Batch processing method and computer program product for front-end editing of multimedia images | |
JP2006033388A (ja) | Display program, display method, display device, and recording medium | |
JP2009086948A (ja) | Slideshow data creation device, slideshow data creation method, and computer program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | WIPO information: entry into national phase | Ref document number: 200980158106.1; Country of ref document: CN |
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 09842568; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2011506843; Country of ref document: JP; Kind code of ref document: A |
| WWE | WIPO information: entry into national phase | Ref document number: 13146660; Country of ref document: US |
| WWE | WIPO information: entry into national phase | Ref document number: 112009004615; Country of ref document: DE; Ref document number: 1120090046150; Country of ref document: DE |
| 122 | EP: PCT application non-entry in European phase | Ref document number: 09842568; Country of ref document: EP; Kind code of ref document: A1 |