WO2005071616A1 - Image creating apparatus and image creating method - Google Patents
Image creating apparatus and image creating method
- Publication number
- WO2005071616A1 (PCT/JP2005/000989)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- character
- input
- character string
- name
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
Definitions
- the present invention relates to an image creating apparatus and an image creating method for creating computer graphics from text.
- Patent Document 1 Japanese Patent Application Laid-Open No. 2001-307137
- in the prior art, the engine for searching for material data and the engine for creating computer graphics are different. The user must therefore start the search engine, search for material data, and then restart the engine for creating computer graphics before moving on to the process of creating computer graphics with the retrieved material data. As a result, a scenario conceived before the search may be forgotten during the search process and must be recalled when the computer graphics creation process resumes. Conversely, by the time the user moves on to the process of creating computer graphics, the user may have forgotten what the retrieved material data was.
- the retrieved material data may be frequently used by the user.
- since the retrieved material data is not managed, there is a problem that the same material data must be searched for many times.
- An object of the present invention is to provide an image creating apparatus and an image creating method that allow the computer graphics creation process and the search for material data to be performed as a series of operations, and that make the retrieved material data easy to use thereafter.
- a character string/material correspondence table, in which material data for creating computer graphics is associated with the material names of that material data, and a hierarchical structure description, which hierarchically describes the features of the material data, are provided. When a feature is input, material data corresponding to the feature is searched for using the hierarchical structure description, and the retrieved material data is stored.
- the material name and the material data of the retrieved material data are registered in the character string/material correspondence table, and when a material name is input, the material data corresponding to the material name is acquired using the character string/material correspondence table.
- computer graphics are then created using the acquired material data.
- as a result, the computer graphics creation process and the search for material data can be performed as a series of operations, and the retrieved material data can easily be used thereafter.
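The series-of-operations idea above can be sketched as follows. This is a minimal illustration, not the patent's implementation: all names (`material_store`, `register`, `resolve`, the sample ids and labels) are assumptions.

```python
# Once a search returns material data, it is registered in a
# string-to-material correspondence table so that later scenario input
# can reuse it without a second search.

material_store = {"mat-042": {"kind": "character", "label": "young man"}}

def search_materials(feature):
    """Stand-in for the metadata search: ids whose label contains the feature."""
    return [mid for mid, m in material_store.items() if feature in m["label"]]

correspondence_table = {}  # material name -> material id

def register(name, material_id):
    correspondence_table[name] = material_id

def resolve(name):
    """Later scenario input looks the name up instead of searching again."""
    return material_store.get(correspondence_table.get(name))

hits = search_materials("young")
register("Taro", hits[0])   # one series of operations: search result feeds registration
```

After registration, `resolve("Taro")` returns the stored material directly, which is the reuse benefit the paragraph describes.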
- FIG. 1 is a block diagram of an image creating apparatus according to an embodiment of the present invention.
- FIG. 2 A first diagram showing a description example of character data to be applied to the present embodiment.
- FIG. 3 A second diagram showing a description example of character data to be applied to the present embodiment.
- FIG. 6 A configuration diagram of the character string/character correspondence table according to the present embodiment.
- FIG. 8 A configuration diagram of the character string/set correspondence table according to the present embodiment.
- FIG. 10 A diagram showing an example of description of search results for a metadata database according to the present embodiment.
- FIG. 11 A first diagram showing the concept of hierarchical structure description according to the present embodiment.
- FIG. 12 A first diagram showing an example of description of hierarchical structure description according to the present embodiment.
- FIG. 13 A second diagram showing a description example of the hierarchical structure description according to the present embodiment.
- FIG. 15 A diagram showing the GUI displayed when creating an animation according to the present embodiment.
- FIG. 16 A configuration diagram of the GUI displayed when searching for material data according to the present embodiment.
- FIG. 17 Flowchart of the scenario determination process of the image creating apparatus according to the present embodiment.
- FIG. 18 Flowchart of the set determination process of the image creating apparatus according to the present embodiment.
- FIG. 19 A diagram for explaining the set determination process of the image creating apparatus according to the present embodiment.
- FIG. 20 Flowchart of the character determination process of the image creating apparatus according to the present embodiment.
- FIG. 21 A diagram for explaining the character determination process of the image creating apparatus according to the present embodiment.
- FIG. 22 is a flowchart of operation determination processing of the image generation apparatus according to the present embodiment.
- FIG. 23 is a diagram for explaining the action determination process of the image generation device according to the present embodiment.
- FIG. 24 Flowchart of the object determination process of the image creation device according to the present embodiment.
- FIG. 25 A diagram for explaining the object determining process of the image creating apparatus according to the present embodiment.
- FIG. 26 A diagram for explaining scenario input of the image creating apparatus according to the present embodiment.
- FIG. 27 A diagram for explaining the relationship of the objects that contributes to the present embodiment.
- FIG. 28 A flowchart of a search process of an image creation apparatus according to the present embodiment
- FIG. 29 A flowchart of material data search processing of an image creating apparatus according to the present embodiment.
- FIG. 30 A first diagram for explaining material data search processing of an image creating apparatus according to the present embodiment.
- FIG. 31 A second diagram for explaining the material data search process of the image creation device according to the present embodiment.
- FIG. 1 is a block diagram of an image forming apparatus according to the present embodiment.
- the image creating apparatus 100 is provided with a CPU 101 that controls the entire apparatus.
- the CPU 101 also operates as a processing unit having various functions by loading a program stored in the HDD 102 into the RAM 103 and executing the program.
- the image creating apparatus 100 is provided with an input unit 104 such as a keyboard and a mouse. Further, the image creating apparatus 100 is provided with a monitor 105 as a display means.
- the image creating apparatus 100 is provided with a character data database 108 storing character data 112, an action data database 109 storing action data 113, and a set data database 110 storing set data 114, which are animation data for computer graphics.
- a transmitting/receiving unit 106 is provided, which receives character data, action data, and set data via the Internet 107. The transmitting/receiving unit 106 also receives metadata from a metadata database 111.
- the image creating apparatus 100 further includes a character data storage unit 115 for storing a plurality of character data 112 acquired via the Internet 107, an action data storage unit 116 for storing a plurality of action data 113 acquired via the Internet 107, and a set data storage unit 117 for storing a plurality of set data 114 acquired via the Internet 107.
- the metadata is metadata for explaining the material data, namely the character data 112, the motion data 113, and the set data 114.
- the metadata describes the feature of the corresponding material data, the thumbnail image, the link address to the material data, and the like. The details of the metadata will be described later.
- the character data 112 is a group of parameters such as vertex coordinates of the character.
- the motion data 113 is a group of parameters for causing the character to perform a predetermined motion.
- the parameters include rotation matrices and translation matrices.
- the set data 114 is a group of parameters, such as vertex coordinates, for a set, that is, information on a place and on objects that a character can operate. The details of the character data 112, the action data 113, and the set data 114 will be described later.
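The three kinds of material data described above can be pictured as simple containers. This is an illustrative sketch only: the field names are assumptions and do not reflect the actual BAC/TRA parameter layouts.

```python
from dataclasses import dataclass, field

@dataclass
class CharacterData:      # character data 112: vertex coordinates etc.
    vertices: list

@dataclass
class MotionData:         # motion data 113: rotation / translation matrices
    rotation: list
    translation: list

@dataclass
class SetData:            # set data 114: place geometry plus operable objects
    vertices: list
    objects: list = field(default_factory=list)
```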
- Character data 112 is associated with a predetermined character string (subject noun character string).
- the character string/character correspondence table storage unit 119 stores a character string/character correspondence table 118 in which character strings and the corresponding character data 112 are associated with each other.
- the image creating apparatus 100 is provided with a character / action correspondence table storage unit 121 for storing a character / action correspondence table 120 for obtaining correspondence between the character data 112 and the action data 113.
- a character string/set correspondence table 122 for obtaining the correspondence between a predetermined character string (place name character string), the set data 114, and the objects included in the set data 114 is stored in a character string/set correspondence table storage unit 123.
- the action data 113 includes actions that use an object included in the set data 114 and actions that do not. Therefore, the image creating apparatus 100 is provided with a motion dictionary storage unit 125 storing a motion dictionary 124 in which information on whether or not each action uses an object is described.
- the image generation apparatus 100 is provided with an input control unit 126.
- the input control unit 126 extracts a place name character string representing a place, a subject noun character string representing a subject, and a verb character string representing an action from the scenario input to the input unit 104, and sends them to the character string/CG conversion processing unit 131.
- the character string/CG conversion processing unit 131 refers to the character string/character correspondence table 118, the character/action correspondence table 120, the character string/set correspondence table 122, and the motion dictionary 124, and selects the character data 112, action data 113, and set data 114 corresponding to the character strings sent from the input control unit 126. Then, the character string/CG conversion processing unit 131 sends the selected character data 112, action data 113, and set data 114 to the display control unit 127.
- the display control unit 127 creates computer graphics based on the sent character data 112, action data 113, and set data 114, and displays them on the monitor 105.
- the image creation device 100 is provided with a search unit 128.
- the search unit 128 acquires a plurality of metadata stored in the metadata database 111, extracts features from the acquired metadata, and manages them as a hierarchical structure description 130. The details of the hierarchical structure description 130 will be described later.
- the search unit 128 refers to the hierarchical structure description 130 and detects features that partially match the character string sent from the input control unit 126. Then, the search unit 128 sends the detected features to the display control unit 127.
- the search unit 128 sends information related to the character data 112, the operation data 113, and the set data 114 corresponding to the feature selected from the list of features sent from the input control unit 126 to the registration unit 129.
- the registration unit 129 stores the sent character data 112, action data 113, and set data 114.
- the image creating apparatus 100 is configured as described above.
- FIGS. 2 and 3 are diagrams showing an example of description of the character data 112.
- in the present embodiment, the character data 112 is described in BAC format (Ver. 6.0), but other description formats may be used.
- the character data 112 includes coordinate values indicating the basic posture of the character, as indicated by 201 in the figure, the type of character display method, as indicated by 202, the character type, as indicated by 203, and the data of the polygons used for the character.
- the description method of the set data 114 is also the same as the description method of the character data 112, and thus the detailed description will be omitted.
- FIGS. 4 and 5 are diagrams showing a description example of the action data 113. In the present embodiment, the action data 113 is described in TRA format (Ver. 4.0), but other description formats may be used.
- the action data 113 describes movement in each coordinate direction, as indicated by 401a-401f in the figure, enlargement/reduction in each coordinate direction, as indicated by 402a-402f, axis vectors for each direction, as indicated by 403a-403f, and rotation angles about those axes, as indicated by 404a-404f.
- the character string/character correspondence table 118 is a table having a plurality of sets of: a subject noun character string 601, which is the name of character data 112; link information 602 to the character data 112 corresponding to the subject noun character string 601; and character adjective strings 603, which represent the features of the character data 112.
- in some cases, link information 602 to a plurality of character data 112 is associated with one subject noun character string 601. That is, one subject noun character string 601 has at least one character data 112.
- by referring to the character string/character correspondence table 118, the character string/CG conversion processing unit 131 can easily detect whether there is a subject noun character string 601 corresponding to the character string input from the input unit 104, and can send a list of subject noun character strings 601 to the display control unit 127. In addition, by referring to the character string/character correspondence table 118, the character string/CG conversion processing unit 131 can easily detect the acquisition destination of the character data 112 for the input character string (subject noun character string 601).
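As a sketch, the table of FIG. 6 can be modelled as a mapping from subject noun character strings to character-data entries. The entries, link paths, and helper names here are hypothetical stand-ins for the table's fields 601-603.

```python
# subject noun string 601 -> [{link 602, adjectives 603}, ...]
char_table = {
    "boy": [
        {"link": "chars/boy_a.bac", "adjectives": ["young", "small"]},
        {"link": "chars/boy_b.bac", "adjectives": ["tall"]},
    ],
}

def candidates(prefix):
    """Subject noun strings beginning with the typed characters."""
    return [s for s in char_table if s.startswith(prefix)]

def character_links(subject):
    """Acquisition destinations of the character data for an input string."""
    return [entry["link"] for entry in char_table.get(subject, [])]
```

The prefix lookup corresponds to detecting whether a matching subject noun character string exists; the link lookup corresponds to finding the acquisition destination.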
- FIG. 7 is a configuration diagram of the character/action correspondence table 120 according to the present embodiment.
- the character/action correspondence table 120 is a table having a plurality of sets of: a character data name 701; a verb character string 702; an expression name character string 703 corresponding to the verb character string 702; and link information 704 to the action data 113 corresponding to the verb character string 702.
- in some cases, a plurality of verb character strings 702 are associated with one character data name 701. That is, one character data 112 has at least one action data 113.
- in some cases, link information 704 to a plurality of action data is associated with one verb character string 702.
- one or more expression name character strings 703 may be associated with one verb character string 702. That is, action data 113 having at least one expression is associated with one verb character string 702.
- by referring to the character/action correspondence table 120, the character string/CG conversion processing unit 131 can easily detect whether there is a verb character string 702 corresponding to an action that the character input from the input unit 104 can perform, and can send a list of verb character strings 702 to the display control unit 127.
- in addition, by referring to the character/action correspondence table 120, the character string/CG conversion processing unit 131 can easily detect the acquisition destination of the action data 113 for the input verb character string 702.
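A hypothetical shape for the character/action correspondence table of FIG. 7: per character data name 701, verb character strings 702 carrying expression names 703 and links 704 to the corresponding action data. All concrete names and paths are illustrative.

```python
action_table = {
    "boy_a": {
        "walk": {"expressions": ["smiling", "neutral"],
                 "links": ["motions/walk_01.tra", "motions/walk_02.tra"]},
        "sit":  {"expressions": ["neutral"],
                 "links": ["motions/sit_01.tra"]},
    },
}

def verbs_for(character_name):
    """Verb strings the given character can perform."""
    return sorted(action_table.get(character_name, {}))

def motion_links(character_name, verb):
    """Acquisition destinations of the action data for the input verb."""
    return action_table[character_name][verb]["links"]
```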
- FIG. 8 is a configuration diagram of the character string/set correspondence table 122.
- the character string/set correspondence table 122 is a table having a plurality of sets of: a place name character string 801; link information 802 to the set data 114 corresponding to the place name character string 801; object names 803 of the objects included in that set data 114; object adjective character strings 804, which represent the features of the objects indicated by the object names 803; and object parameters 805 such as the position, size, and action range of each object.
- in some cases, link information 802 to a plurality of set data is associated with one place name character string 801. That is, one place name character string 801 has at least one set data 114.
- in some cases, a plurality of object names 803 are associated with the link information 802 to one set data 114. That is, at least one object name 803 is associated with one set data 114.
- by referring to the character string/set correspondence table 122, the character string/CG conversion processing unit 131 can easily detect whether the place name character string 801 input from the input unit 104 can be used, and can send a list of place name character strings 801 to the display control unit 127.
- in addition, by referring to the character string/set correspondence table 122, the character string/CG conversion processing unit 131 can easily detect the acquisition destination of the set data 114 for the input place name character string.
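A sketch of the character string/set correspondence table of FIG. 8: a place name string maps to a set-data link and to the objects contained in that set, each with adjectives and placement parameters. The concrete values are assumptions.

```python
set_table = {
    "park": {
        "link": "sets/park.bac",                    # link information 802
        "objects": [                                # object names 803 with 804/805
            {"name": "bench", "adjectives": ["wooden"],
             "params": {"pos": (3.0, 0.0, 1.5), "size": 1.0}},
        ],
    },
}

def set_link(place):
    """Acquisition destination of the set data for a place name string."""
    return set_table[place]["link"]

def object_names(place):
    """Objects an action in this set can act on."""
    return [o["name"] for o in set_table[place]["objects"]]
```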
- FIG. 9 is a configuration diagram of the motion dictionary 124.
- the motion dictionary 124 describes a plurality of verb character strings and, for each verb character string, information on whether it requires an object.
- a verb character string marked as taking an object after it (for example, "sit on") means that an object is required after the verb.
- a verb character string marked as taking an object in the middle (for example, "stand up") means that an object is required between its words.
- a verb character string carrying both markers (for example, "made up") means that an object is required both between its words and after it.
- by referring to the motion dictionary 124, the character string/CG conversion processing unit 131 can easily detect whether the verb character string input from the input unit 104 requires an object, and the position of that object. Information as to whether an object is required can then be sent to the display control unit 127.
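A minimal sketch of the motion dictionary lookup: for each verb string it records whether an object is required and where the object sits relative to the verb. The position labels here are assumptions standing in for the markers that are garbled in this translation.

```python
motion_dict = {
    "sit on":  {"needs_object": True,  "position": "after"},
    "pick up": {"needs_object": True,  "position": "between"},
    "walk":    {"needs_object": False, "position": None},
}

def object_requirement(verb):
    """Return (needs_object, position) for a verb string; defaults to no object."""
    entry = motion_dict.get(verb, {"needs_object": False, "position": None})
    return entry["needs_object"], entry["position"]
```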
- FIG. 10 is a diagram showing an example of description of search results in the metadata database.
- the search result 1000 for the metadata database shown in FIG. 10 is described in XML, based on the element "SearchResults".
- the element "SearchResults" is composed of a plurality of "item" elements.
- in the metadata 1006, a link address to the location where a thumbnail of the material data for the metadata 1006 is stored is described.
- the metadata 1006 describes a link address to a location where information of material data for the metadata 1006 is stored.
- the metadata 1006 describes a link address to a place where the material data for the metadata 1006 is stored.
- the metadata 1006 describes the feature of the material data with respect to the metadata 1006.
- the feature 1005 is described by an attribute (type) and a value (value).
- for example, feature information defined by the pair of attribute "Age" and value "Young" is described.
- the metadata 1006 describes the reference destination regarding the storage location of the material data, the reference destination of the thumbnail indicating the material data, and the information indicating the feature of the material data.
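A rough sketch of consuming a search result shaped like FIG. 10: a `SearchResults` element containing `item` elements. The attribute names `thumbnail` and `data` are assumptions for illustration; `type` and `value` follow the feature description above.

```python
import xml.etree.ElementTree as ET

xml_text = """
<SearchResults>
  <item thumbnail="thumbs/m1.png" data="materials/m1.bac">
    <feature type="Age" value="Young"/>
    <feature type="Trouser.Color" value="Blue"/>
  </item>
</SearchResults>
"""

def parse_results(text):
    """Collect thumbnail link, data link, and feature pairs per item."""
    results = []
    for item in ET.fromstring(text).findall("item"):
        results.append({
            "thumbnail": item.get("thumbnail"),
            "data": item.get("data"),
            "features": {f.get("type"): f.get("value")
                         for f in item.findall("feature")},
        })
    return results

items = parse_results(xml_text)
```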
- FIG. 11 is a view showing the concept of the hierarchical structure description 130 for the character data 112 and the set data 114.
- the conceptual model 1100 of the hierarchical structure description 130 describes, below an attribute "Key" and value "Term" of an upper concept (parent element), the attributes "Key" and values "Term" of lower concepts (child elements). That is, the conceptual model 1100 of the hierarchical structure description 130 hierarchically describes a plurality of attributes "Key" and values "Term". For example, in the conceptual model 1100 of the hierarchical structure description 130 shown in FIG. 11, the topmost Key "Content" 1101 has a plurality of Terms: "Character" 1102a, "Picture" 1102b, and "Sound" 1102c.
- the Term "Human" 1104a has a plurality of child elements: Key "Sex" 1105a, Key "Age" 1105b, and Key "Wear" 1105c.
- the Key "Wear" 1105c has a plurality of child elements: Term "Shirt" 1106a, Term "Trouser" 1106b, and Term "Glasses" 1106c.
- the Term "Trouser" 1106b has a child element Key "Trouser.Color" 1107a.
- the Key "Trouser.Color" 1107a has a plurality of Terms: "Blue" 1108a, "Gray" 1108b, "Green" 1108c, and "Brown" 1108d.
- the conceptual model 1100 of the hierarchical structure description 130 manages the features of the material data by hierarchically managing a plurality of attributes “Key” and values “Term”.
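The Key/Term alternation of FIG. 11 can be modelled with nested dictionaries, where each level maps a Key to its Terms or a Term to its child Keys. This sketch reproduces only the Wear/Trouser.Color branch and omits the figure's intermediate levels.

```python
hierarchy = {
    "Content": {                      # Key
        "Character": {                # Term
            "Sex": {}, "Age": {},
            "Wear": {                 # Key
                "Shirt": {}, "Glasses": {},
                "Trouser": {          # Term
                    "Trouser.Color": {  # Key
                        "Blue": {}, "Gray": {}, "Green": {}, "Brown": {},
                    },
                },
            },
        },
        "Picture": {}, "Sound": {},
    },
}

def children(path):
    """Keys/Terms one level below the given path, e.g. the colors under Trouser."""
    node = hierarchy
    for step in path:
        node = node[step]
    return sorted(node)
```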
- FIG. 12 shows a description example of the hierarchical structure description 130.
- the description example 1200 corresponds to the conceptual model 1100 of the vocabulary management table shown in FIG. 11.
- since the description example 1200 is an example of the hierarchical structure description 130, it will hereinafter be referred to as the vocabulary management table 1200.
- the vocabulary management table 1200 shown in FIG. 12 describes the conceptual model 1100 of the vocabulary management table shown in FIG. 11 in XML. Note that, for convenience of explanation, the vocabulary management table 1200 and the conceptual model 1100 of the vocabulary management table completely coincide with each other.
- in the vocabulary management table 1200, the top-level Key "Content" 1201 has a plurality of Terms: "Character" 1202a, "Picture" 1202b, and "Sound" 1202c.
- the Term "Human" 1204a has a plurality of child elements: Key "Sex" 1205a, Key "Age" 1205b, and Key "Wear" 1205c.
- the Key "Wear" 1205c has a plurality of child elements: Term "Shirt" 1206a, Term "Trouser" 1206b, and Term "Glasses" 1206c.
- the Term "Trouser" 1206b has a child element Key "Trouser.Color" 1207a.
- the Key "Trouser.Color" 1207a has a plurality of Terms: "Blue" 1208a, "Gray" 1208b, "Green" 1208c, and "Brown" 1208d.
- the Term "Picture" 1202b has a child element Key "Format" 1209, and the Key "Format" 1209 has a plurality of child elements: Term "JPEG" 1210a, Term "GIF" 1210b, and Term "PNG" 1210c.
- the Term "JPEG" 1210a has a child element Key, as shown at 1211, and, as shown at 1212, that Key 1211 has a plurality of child element Terms.
- likewise, the Key shown at 1213 has a plurality of child element Terms, including the Term "MP3" 1214d.
- the term “MP3” 1214d has a link destination in which the key or the like of the lower child element is described.
- the link destination 1215 has a Key "Bit Rate" 1301, as shown in FIG. 13, and the Key "Bit Rate" 1301 has a plurality of child element Terms, as shown at 1302.
- the description example 1200 can place the description below the Term "MP3" 1214d in the link destination, that is, in a separate description. In this way, the vocabulary management table can be described in multiple separate parts, and each description can be managed separately.
- the hierarchical structure description 130 hierarchically manages the attributes "Key" and values "Term" that are the features of the material data. This makes it possible to search the material data roughly using an upper-concept vocabulary and then search the material data in detail using a lower vocabulary.
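The coarse-to-fine search this enables might look like the following sketch; the material records and feature keys are illustrative, not the patent's data.

```python
materials = [
    {"id": "m1", "features": {"Wear": "Trouser", "Trouser.Color": "Blue"}},
    {"id": "m2", "features": {"Wear": "Trouser", "Trouser.Color": "Gray"}},
    {"id": "m3", "features": {"Wear": "Shirt"}},
]

def narrow(candidates, key, term):
    """Keep only candidates whose feature for `key` equals `term`."""
    return [m for m in candidates if m["features"].get(key) == term]

coarse = narrow(materials, "Wear", "Trouser")      # upper-concept search
fine = narrow(coarse, "Trouser.Color", "Blue")     # lower-vocabulary refinement
```

Each refinement step reuses the previous result, mirroring how the hierarchy guides the user from broad to specific vocabulary.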
- the conceptual model 4000 of the hierarchical structure description 130 for the action data 113 hierarchically describes a plurality of attributes "Key" and values "Term", in the same manner as the conceptual model 1100 shown in FIG. 11.
- the topmost Key "Content" 1101 has, among its plurality of Terms, the Term "Motion" 1102d.
- a Key 1109b has a plurality of child elements, including the Term "Posing" 1110e, the Term "Bowing" 1110f, and the Term "Kicking" 1110g.
- the Term "Walking" 1110a has a plurality of child elements: Key "Speed" 1112a, Key "Emotion" 1112b, and Key "Direction" 1112c.
- the Key "Direction" 1112c has a plurality of child elements: Term "Forward" 1113a, Term "Backward" 1113b, Term "Rightward" 1113c, and Term "Leftward" 1113d.
- the conceptual model 4000 of the hierarchical structure description 130 manages the features of the material data by hierarchically managing a plurality of attributes “Key” and values “Term”.
- as a result, the management of the data can be simplified and the features become clearer.
- FIG. 15 is a diagram showing a GUI displayed when creating an animation.
- the GUI 1400 is provided with a scene input unit 1401 for setting the scene name of the computer graphics.
- the scene input unit 1401 receives a scene name such as a scene number.
- a place name character string input unit 1402 is provided for inputting information such as the scene, situation, and location of the computer graphics.
- the place name character string input unit 1402 displays a list of available place name character strings 801 as the user types. For example, when the user enters one character, all place name character strings 801 beginning with that character are displayed; when the user enters two characters, all place name character strings 801 beginning with those two characters are displayed.
- by referring to the place name character string input unit 1402, the user can recognize which place name character strings 801 have set data 114, and can input the desired place name character string 801 after recognizing it. Below the place name character string input unit 1402 of the GUI 1400, a subject input unit 1403 for inputting the subject of the computer graphics is provided.
- the subject input unit 1403 displays a list of available subject noun character strings 601 as the user types. For example, when the user enters one character, all subject noun character strings 601 beginning with that character are displayed; when the user enters two characters, all subject noun character strings 601 beginning with those two characters are displayed.
- by referring to the subject input unit 1403, the user can recognize which subject noun character strings 601 have character data 112, and can input the desired subject noun character string after recognizing it.
- next, an action input unit 1404 for inputting the action to be performed by the character is provided.
- the action input unit 1404 displays a list of verb character strings 702 corresponding to the set subject noun character string 601. The action input unit 1404 also displays a list of matching verb character strings 702 as the user types: when the user enters one character, all verb character strings 702 beginning with that character are displayed, and when the user enters two characters, all verb character strings 702 beginning with those two characters are displayed.
- the action input unit 1404 also displays a list of expression name character strings 703 corresponding to the input verb character string 702. In this case as well, the action input unit 1404 displays a list of matching expression name character strings 703 as the user types.
- the user can recognize the verb string 702 corresponding to the input character by referring to the operation input unit 1404, and can input the desired verb string 702 after recognition.
- next, an object input unit 1405 for inputting the object on which an action acts is provided.
- when the user sets a verb character string 702 in the action input unit 1404, the object input unit 1405 displays a list of object names 803 on which that verb character string 702 can act. The object input unit 1405 also displays a list of available object names 803 as the user types; for example, when the user enters one character, all object names 803 beginning with that character are displayed.
- by referring to the object input unit 1405, the user can recognize the object names 803 corresponding to the input verb character string 702, and can input the desired object name 803 after recognizing it.
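All of the GUI 1400 input fields above share the same incremental-candidate behaviour, which reduces to prefix matching over the registered strings; the vocabulary below is made up for illustration.

```python
def completions(typed, vocabulary):
    """All registered strings that begin with the characters typed so far."""
    return sorted(s for s in vocabulary if s.startswith(typed))

# Example vocabulary standing in for registered place name strings 801.
places = ["park", "parking lot", "pool", "school"]
```

Typing "p" would list three candidates, and typing "pa" narrows the list further, matching the one-character/two-character behaviour described for each input unit.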
- at the lower right of the GUI 1400, a preview display unit 1406 is provided for displaying a preview of the computer graphics corresponding to the scenario input by the user from the place name character string input unit 1402, the subject input unit 1403, the action input unit 1404, and the object input unit 1405 of the GUI 1400.
- the preview is displayed when the user presses the preview button 1408 located below the preview display unit 1406.
- the preview display unit 1406 also displays a preview corresponding to the place name character string 801, a preview corresponding to the subject noun character string 601, a preview corresponding to the verb character string 702, and a preview corresponding to the object name 803.
- a next-candidate button 1407 is provided for selecting, when a plurality of data correspond to the place name character string 801, subject noun character string 601, verb character string 702, and object name 803 selected in the place name character string input unit 1402, the subject input unit 1403, the action input unit 1404, and the object input unit 1405, data other than the data used for the preview currently displayed on the preview display unit 1406.
- in addition, a send button 1409 is provided for sending the selected place name character string 801, subject noun character string 601, verb character string 702, and object name 803 to the display control unit 127.
- FIG. 16 is a block diagram of a GUI displayed when searching for material data.
- in the GUI 1500, a feature input unit 1502 for inputting the features with which to search for material data is provided.
- as the user types, the feature input unit 1502 extracts from the hierarchical structure description 130 and displays the features (Term, Key) that partially match the input characters. For example, when the user enters one character, all features (Term, Key) beginning with that character are displayed; when the user enters two characters, all features (Term, Key) beginning with those two characters are displayed.
- when the user determines a feature in the feature input unit 1502, the GUI 1500 has a feature display unit 1503 that displays the features (Term, Key) corresponding to the determined feature (Term, Key). Furthermore, when the user determines a feature in the feature display unit 1503, the GUI 1500 has a feature display unit 1504 that displays the lower or upper features (Term, Key) corresponding to the determined feature (Term, Key).
- the number of feature display units 1503 and 1504 is not limited to the example shown in FIG. 16 and may be any number.
- a thumbnail display unit 1505 for displaying a thumbnail of material data for the determined feature is provided.
- the user can grasp the outline and the number of the material data corresponding to the feature determined in the feature display units 1503 and 1504 by looking at the thumbnail display unit 1505.
- FIG. 17 is a flowchart of scenario determination processing of the image creating apparatus 100 according to the present embodiment.
- the input control unit 126 monitors whether there is a scene input from the scene input unit 1401 of the GUI 1400. If there is a scene input, the character string/CG conversion processing unit 131 sets the scene (S1701). There may also be cases where no scene is set, and there may be a plurality of scenes set.
- the image creating apparatus 100 monitors, in the input control unit 126, whether there is an input of the place name character string 801 from the place name character string input unit 1402 of the GUI 1400. If there is an input of the place name character string 801, the character string/CG conversion processing unit 131 determines the set (S1702). The details of the process of S1702 will be described later. There is also a case where the animation is created with a default set or no set, without selecting a set.
- the character string/CG conversion processing unit 131 determines the character (S1703). The details of the process of S1703 will be described later.
- the image creating apparatus 100 determines the action (S1704), searches the motion dictionary 124 (S1705), and determines whether the action determined in S1704 requires an object (S1706).
- if an object is required, the character string/CG conversion processing unit 131 refers to the character string/set correspondence table 122, extracts the object list, and sends it to the display control unit 127. The display control unit 127 then displays the object name list. The image creating apparatus 100 monitors, in the input control unit 126, whether there is an input of the object name 803 from the object input unit 1405 of the GUI 1400, and if there is an input of the object name 803, the character string/CG conversion processing unit 131 determines the object (S1707). The details of the process of S1707 will be described later.
- the image creating apparatus 100 monitors, in the input control unit 126, whether there is a scene addition from the scene input unit 1401 of the GUI 1400 (S1708), and if there is a scene addition, performs the processing of S1701 and later on the added scene. If it is determined in S1706 that no object is necessary, the image creating apparatus 100 skips the process of S1707 and shifts to the process of S1708.
- the character string/CG conversion processing unit 131 acquires the material data (character data 112, operation data 113, set data 114) for the set, character, action, and object determined in the above-described processing (S1709), generates a moving image using the acquired material data (S1710), and displays the generated moving image on the monitor 105 (S1711).
- in this way, the image creating apparatus 100 receives the input of features for the scenario, acquires material data according to the features, and creates a moving image using the material data.
- the image creating apparatus 100 thus determines a scenario for creating computer graphics, and creates a moving image.
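One way to picture the S1701–S1711 flow is the sketch below: set, character, action, and (only when required) object are determined in turn, and a moving image is then produced from the corresponding material data. All function names, the object-requiring action set, and the "rendering" (a summary string) are assumptions for illustration, not the apparatus's actual interfaces.

```python
def needs_object(action):
    # Stand-in for the motion dictionary lookup (S1705-S1706); which actions
    # require an object is an assumption for this sketch.
    return action in {"sits on", "goes close to"}

def build_scenario(place, subject, action, obj=None):
    scenario = {"set": place, "character": subject, "action": action}
    if needs_object(action):
        scenario["object"] = obj  # object determination (S1707) only when required
    return scenario

def create_animation(scenario):
    # Stand-in for S1709-S1711: "acquire material data and render" is reduced
    # to assembling a summary string.
    parts = [scenario["character"], scenario["action"]]
    if "object" in scenario:
        parts.append(scenario["object"])
    return scenario["set"] + ": " + " ".join(parts)

print(create_animation(build_scenario("Park", "Girl", "sits on", "bench")))
# Park: Girl sits on bench
```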
- FIG. 18 is a flowchart of the set determination process of the image creating apparatus 100.
- the input control unit 126 of the image creating apparatus 100 monitors whether there is a scenario input from the input unit 104 to the GUI 1400 (S1801). Specifically, in S1801, it is monitored whether the place name character string input unit 1402 receives a selection (click processing) such as a mouse click, or an input of the place name character string 801.
- when there is an input of the place name character string 801, the input control unit 126 sends the input characters to the character string/CG conversion processing unit 131; when there is a click processing, it sends a notification to that effect.
- the character string/CG conversion processing unit 131 searches the character string/set correspondence table 122 (S1802). When the character string/CG conversion processing unit 131 receives characters related to the place name character string 801, it searches for place name character strings 801 having the sent characters at the beginning.
- when the character string/CG conversion processing unit 131 receives the information indicating that the click processing has been performed, all the place name character strings 801 in the character string/set correspondence table 122 are searched (S1803).
- the character string/CG conversion processing unit 131 sends the detected place name character strings 801 to the display control unit 127. Then, the display control unit 127 arranges and displays the sent place name character strings 801 below the characters input in the place name character string input unit 1402 (S1804).
- for example, when the click processing is performed, the character string/CG conversion processing unit 131 searches all the place name character strings 801 from the character string/set correspondence table 122 (S1802, S1803), and sends them to the display control unit 127. Then, as shown in FIG. 19, the display control unit 127 displays the sent place name character strings 801, together with "more..." for executing the search processing of the set data 114 by the search unit 128, side by side in the place name character string input unit 1402 (S1804).
- the input control unit 126 monitors whether there is an input of an additional scenario of the scenario input in S1801 (S1805). Then, if there is an additional scenario in S1805, the processing after S1803 is performed on the character string including the additional scenario.
- the input control unit 126 monitors whether the user determines the place name character string 801 using the place name character string input unit 1402, and when the user determines the place name character string 801, the determined place name character string 801 is sent to the character string/CG conversion processing unit 131.
- the character string/CG conversion processing unit 131 refers to the character string/set correspondence table 122 and extracts the link 802 to the set data 114 corresponding to the place name character string 801 that has been sent.
- the character string/CG conversion processing unit 131 extracts the set data 114 using the extracted link information 802, and sends the extracted set data 114 to the display control unit 127.
- the display control unit 127 displays a preview for the sent set data 114 on the preview display unit 1406 of the GUI 1400 (S1806).
- when a plurality of set data 114 are sent, the display control unit 127 selects, for example, the first set data 114, and displays a preview for the selected set data 114 on the preview display unit 1406 of the GUI 1400.
- the input control unit 126 monitors whether the next candidate button 1407 of the GUI 1400 is pressed, whether "more..." displayed on the place name character string input unit 1402 is selected, or whether the send button 1409 is pressed (S1807).
- if the send button 1409 is pressed, the display control unit 127 ends the processing, and the generated animation is sent from the transmitting/receiving unit 106 to another image creating apparatus 100 on the Internet 107.
- when sending to another image creating apparatus 100, only the animation data, or both the animation data and the scenario data, are sent.
- when only the animation data is received, the other image creating apparatus 100 causes its display control unit 127 to display a preview on the preview display unit 1406 of the GUI 1400.
- when the animation data and the scenario data are received, the display control unit 127 displays a preview on the preview display unit 1406 of the GUI 1400, and the scenario can be input and edited in the same manner as in the transmitting image creating apparatus 100.
- if it is determined that the next candidate button 1407 has been pressed, the character string/CG conversion processing unit 131 selects, from among the set data 114 corresponding to the place name character string 801 selected by the user in the place name character string input unit 1402, set data 114 other than the set data 114 whose preview was displayed in S1806. Then, the character string/CG conversion processing unit 131 sends the selected set data 114 to the display control unit 127, and the display control unit 127 creates a preview using the sent set data 114, displays it on the preview display unit 1406 (S1808), and shifts to the processing of S1807.
- if "more..." displayed in the place name character string input unit 1402 is selected in S1807, that is, if it is determined that execution of the search processing of the set data 114 by the search unit 128 is instructed, the search unit 128 performs the search processing (S1809), and shifts to the processing of S1807. The details of the search processing (S1809) will be described later. As described above, a list of the place name character strings 801 of the held set data 114 can be displayed for the user. This allows the user to easily grasp what set data 114 exists.
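The set determination above amounts to two lookups against the character string/set correspondence table 122: a prefix search over place name character strings (S1802–S1803), then an extraction of the links for the chosen name (S1806/S1808). A minimal sketch, with invented table contents and link paths:

```python
# Assumed miniature of the character string/set correspondence table 122:
# place name character string -> links into the set data database.
SET_TABLE = {
    "Park":   ["set/park_a.dat", "set/park_b.dat"],
    "School": ["set/school_a.dat"],
}

def list_place_names(prefix=""):
    # S1802-S1803: prefix search over place name strings, or all of them
    # when the user clicks without typing.
    return sorted(name for name in SET_TABLE if name.startswith(prefix))

def set_candidates(place_name):
    # S1806/S1808: the links whose set data can be previewed one after another.
    return SET_TABLE.get(place_name, [])

print(list_place_names())      # ['Park', 'School']
print(set_candidates("Park"))  # ['set/park_a.dat', 'set/park_b.dat']
```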
- FIG. 20 is a flow chart of character determination processing of the image creating apparatus 100.
- the input control unit 126 of the image creating apparatus 100 monitors whether there is a scenario input from the input unit 104 to the GUI 1400 (S2001). Specifically, in S2001, it is monitored whether the subject input unit 1403 receives a selection (click processing) such as a mouse click, or an input of the subject noun character string 601.
- the character string/CG conversion processing unit 131 searches the character string/character correspondence table 118 (S2002). When the character string/CG conversion processing unit 131 receives characters related to the subject noun character string 601, it searches for subject noun character strings 601 having the sent characters at the beginning. Also, upon receiving information indicating that the click processing has been performed, the character string/CG conversion processing unit 131 searches all subject noun character strings 601 in the character string/character correspondence table 118 (S2003).
- the character string/CG conversion processing unit 131 sends the detected subject noun character strings 601 to the display control unit 127. Then, the display control unit 127 arranges and displays the sent subject noun character strings 601 below the characters input in the subject input unit 1403 (S2004).
- for example, when the user inputs "G" in the subject input unit 1403, the character string/CG conversion processing unit 131 searches the character string/character correspondence table 118 for all subject noun character strings 601 beginning with "G" (S2002, S2003), and sends them to the display control unit 127.
- then, the display control unit 127 displays the sent subject noun character strings 601, "Girl" and "Gorilla", together with "more..." for executing the search processing of the character data 112 by the search unit 128, side by side in the subject input unit 1403 (S2004).
- using the history of selections the user has made so far, the subject noun character strings 601 may be arranged in descending order of the number of selections. Furthermore, the subject noun character strings 601 may be arranged in descending order of frequency of use of the subject noun character string 601 when the place name character string 801 is selected.
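The history-based ordering just described might be realized as below. The selection history, its (place name, subject noun) pair shape, and the example names are all assumptions for illustration.

```python
from collections import Counter

# Assumed selection history: (place name string, subject noun string) pairs.
history = [("Park", "Girl"), ("Park", "Girl"),
           ("Park", "Gorilla"), ("School", "Man")]

def rank_subjects(candidates, place=None):
    # Count past selections, optionally only those made while the given
    # place name was selected, then order candidates by descending count.
    counts = Counter(s for p, s in history if place is None or p == place)
    return sorted(candidates, key=lambda s: -counts[s])

print(rank_subjects(["Gorilla", "Girl", "Man"], place="Park"))
# ['Girl', 'Gorilla', 'Man']
```

The same ranking scheme would apply unchanged to the verb character strings, expression name character strings, and object names whose ordering is described later in this section.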
- the input control unit 126 monitors whether there is an input of an additional scenario to the scenario input in S2001 (S2005). Then, if there is an additional scenario in S2005, the processing after S2003 is performed on the character string including the additional scenario.
- the input control unit 126 monitors whether the user determines the subject noun character string 601 using the subject input unit 1403, and when the user determines the subject noun character string 601, the determined subject noun character string 601 is sent to the character string/CG conversion processing unit 131.
- the character string/CG conversion processing unit 131 refers to the character string/character correspondence table 118 and extracts the link 602 to the character data 112 corresponding to the subject noun character string 601 that has been sent.
- the character string/CG conversion processing unit 131 extracts the character data 112 using the extracted link information 602, and sends the extracted character data 112 to the display control unit 127.
- the display control unit 127 displays a preview of the sent character data 112 on the preview display unit 1406 of the GUI 1400 (S2006).
- when a plurality of character data 112 are sent, the display control unit 127 selects, for example, the first character data 112, and displays a preview for the selected character data 112 on the preview display unit 1406 of the GUI 1400.
- the input control unit 126 monitors whether the next candidate button 1407 of the GUI 1400 is pressed, whether "more..." displayed on the subject input unit 1403 is selected, or whether the send button 1409 is pressed (S2007).
- if the send button 1409 is pressed, the display control unit 127 ends the processing, and the generated animation is transmitted from the transmitting/receiving unit 106 to another image creating apparatus 100 on the Internet 107. When sending to another image creating apparatus 100, only the animation data, or both the animation data and the scenario data, are sent.
- when only the animation data is received, the other image creating apparatus 100 causes its display control unit 127 to display a preview on the preview display unit 1406 of the GUI 1400.
- when the animation data and the scenario data are received by the transmitting/receiving unit 106, the display control unit 127 displays a preview on the preview display unit 1406 of the GUI 1400, and the scenario can be input and edited.
- if it is determined that the next candidate button 1407 has been pressed, the character string/CG conversion processing unit 131 selects, from among the character data 112 corresponding to the subject noun character string 601 selected by the user in the subject input unit 1403, character data 112 other than the character data 112 whose preview was displayed in S2006. Then, the character string/CG conversion processing unit 131 sends the selected character data 112 to the display control unit 127, and the display control unit 127 creates a preview using the sent character data 112, displays it on the preview display unit 1406 (S2008), and shifts to the processing of S2007.
- in this way, previews of all the character data 112 corresponding to the subject noun character string 601 selected by the user can be displayed, and the user can select the character data 112 corresponding to the selected subject noun character string 601 after viewing the previews.
- if "more..." displayed in the subject input unit 1403 is selected in S2007, that is, if it is determined that execution of the search processing of the character data 112 by the search unit 128 is instructed, the search unit 128 performs the search processing (S2009), and shifts to the processing of S2007. The details of the search processing (S2009) will be described later.
- as described above, a list of the subject noun character strings 601 of the held character data 112 can be displayed for the user. Thereby, the user can easily grasp what character data 112 exists. Further, by displaying a list of the selectable subject noun character strings 601 of the selectable character data 112 in the GUI 1400, the user does not have to analyze the character string/character correspondence table 118, and can easily grasp the list of selectable subject noun character strings 601.
- FIG. 22 is a flowchart of operation determination processing of the image creating apparatus 100.
- the input control unit 126 of the image creating apparatus 100 monitors whether there is a scenario input from the input unit 104 to the GUI 1400 (S2201). Specifically, in S2201, it is monitored whether the operation input unit 1404 receives a selection (click processing) such as a mouse click, or an input of the verb character string 702.
- if there is an input of the verb character string 702, the input control unit 126 sends the input characters to the character string/CG conversion processing unit 131; if there is a click processing, it sends a notification to that effect.
- the character string/CG conversion processing unit 131 searches the character string/operation correspondence table 120 (S2202). When the character string/CG conversion processing unit 131 receives characters related to the verb character string 702, it searches for verb character strings 702 corresponding to the subject noun character string 601 determined in the character determination processing (S1703) shown in FIG. 17 and having the sent characters at the beginning. In addition, when the character string/CG conversion processing unit 131 receives the information indicating that the click processing has been performed, all verb character strings 702 corresponding to the subject noun character string 601 determined in the character determination processing (S1703) are searched (S2203).
- the character string/CG conversion processing unit 131 sends the detected verb character strings 702 to the display control unit 127.
- the display control unit 127 displays the sent verb character strings 702 side by side below the characters input in the operation input unit 1404 (S2204).
- for example, when the click processing is performed, the character string/CG conversion processing unit 131 searches all the verb character strings 702 corresponding to the subject noun character string 601 (for example, "Girl") determined in the character determination process (S1703) shown in FIG. 17 (S2202, S2203), and sends them to the display control unit 127.
- then, the display control unit 127 displays the sent verb character strings 702, "goes close to", "sits on", and "stands up", together with "more..." for executing the search processing of the operation data 113 by the search unit 128, side by side in the operation input unit 1404 (S2204).
- the user can recognize what verb strings 702 can be selected.
- using the history of selections the user has made so far, the verb character strings 702 may be arranged in descending order of the number of selections. Furthermore, the verb character strings 702 may be arranged in descending order of frequency of use of the verb character string 702 when the subject noun character string 601 is selected, or in descending order of frequency of use of the verb character string 702 when both the place name character string 801 and the subject noun character string 601 are selected.
- the input control unit 126 monitors whether there is an input of an additional scenario to the scenario input in S2201 (S2205). Then, if there is an additional scenario in S2205, the processing after S2203 is performed on the character string including the additional scenario.
- the character string/CG conversion processing unit 131 searches the character string/operation correspondence table 120 for the expression name character strings 703 corresponding to the verb character strings 702 displayed in S2204 (S2206).
- the character string/CG conversion processing unit 131 sends the detected expression name character strings 703 to the display control unit 127. Then, the display control unit 127 arranges and displays the sent expression name character strings 703 in the vicinity of the operation input unit 1404 (S2207).
- the input control unit 126 monitors whether there is an (additional) character (scenario) input for the expression name character string 703 (S2208). Then, if there is an additional scenario in S2208, the processing after S2206 is performed on the character string including the additional scenario.
- specifically, the character string/CG conversion processing unit 131 searches for the expression name character strings 703 having the input characters at the beginning. Then, the display control unit 127 displays the retrieved expression name character strings 703 on the monitor. As described above, by displaying the list of expression name character strings 703, the user can recognize what expression name character strings 703 can be selected.
- using the history of selections the user has made so far, the expression name character strings 703 may be arranged in descending order of the number of selections. The expression name character strings 703 may also be arranged in descending order of frequency of use of the expression name character string 703 when the subject noun character string 601 is selected, when both the place name character string 801 and the subject noun character string 601 are selected, when the verb character string 702 is selected, when both the place name character string 801 and the verb character string 702 are selected, or when all three of the place name character string 801, the subject noun character string 601, and the verb character string 702 are selected.
- the input control unit 126 monitors whether the user determines the verb character string 702 and the expression name character string 703 using the operation input unit 1404 of the GUI 1400 or the like, and when the user determines the verb character string 702 and the expression name character string 703, the determined verb character string 702 and expression name character string 703 are sent to the character string/CG conversion processing unit 131.
- the character string/CG conversion processing unit 131 refers to the character string/operation correspondence table 120, and extracts the link 704 to the operation data 113 corresponding to the verb character string 702 and the expression name character string 703 that have been sent.
- the character string/CG conversion processing unit 131 extracts the operation data 113 using the extracted link information 704, and sends the extracted operation data 113 to the display control unit 127.
- the display control unit 127 displays a preview of the received operation data 113 on the preview display unit 1406 of the GUI 1400 (S2209).
- when a plurality of operation data 113 are sent, the display control unit 127 selects, for example, the first operation data 113, and displays a preview for the selected operation data 113 on the preview display unit 1406 of the GUI 1400.
- the input control unit 126 monitors whether the next candidate button 1407 of the GUI 1400 is pressed, whether "more..." displayed on the operation input unit 1404 is selected, or whether the send button 1409 is pressed (S2210).
- if the send button 1409 is pressed, the display control unit 127 ends the processing. If it is determined that the next candidate button 1407 has been pressed, the character string/CG conversion processing unit 131 selects, from among the operation data 113 corresponding to the verb character string 702 and the expression name character string 703 selected by the user in the GUI 1400, operation data other than the operation data whose preview was displayed in S2209. Then, the character string/CG conversion processing unit 131 sends the selected operation data to the display control unit 127, and the display control unit 127 creates a preview using the sent operation data, displays it on the preview display unit 1406 (S2211), and shifts to the processing of S2210.
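The next candidate button behaviour, here and in the analogous set and object flows (S1808, S2408), can be pictured as cycling through the data linked to the chosen character string, previewing one candidate at a time. The class and link names below are illustrative assumptions, not part of the specification.

```python
import itertools

class CandidateCycler:
    """Cycles through candidate material data, one preview at a time."""
    def __init__(self, links):
        # `links` plays the role of the link information extracted from a
        # correspondence table for the selected character string.
        self._cycle = itertools.cycle(links)

    def next_candidate(self):
        # Each press of the next candidate button previews the next data,
        # wrapping around after the last one.
        return next(self._cycle)

cycler = CandidateCycler(["walk_a.dat", "walk_b.dat", "walk_c.dat"])
previews = [cycler.next_candidate() for _ in range(4)]
print(previews)  # ['walk_a.dat', 'walk_b.dat', 'walk_c.dat', 'walk_a.dat']
```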
- if "more..." displayed in the operation input unit 1404 is selected in S2210, that is, if it is determined that execution of the search processing of the operation data 113 by the search unit 128 is instructed, the search unit 128 performs the search processing (S2212), and shifts to the processing of S2210. The details of the search processing (S2212) will be described later.
- as described above, the user does not need to analyze the character string/operation correspondence table 120, and can easily grasp the list of selectable verb character strings 702 and expression name character strings 703.
- FIG. 24 is a flowchart of object determination processing of the image creating apparatus 100.
- the input control unit 126 of the image creating apparatus 100 monitors whether there is a scenario input from the input unit 104 to the GUI 1400 (S2401). Specifically, in S2401, it is monitored whether the object input unit 1405 receives a selection (click processing) such as a mouse click, or an input of the object name 803.
- if there is an input of the object name 803, the input control unit 126 sends the input characters to the character string/CG conversion processing unit 131; if there is a click processing, it sends a notification to that effect.
- the character string/CG conversion processing unit 131 searches the character string/set correspondence table 122 (S2402). When the character string/CG conversion processing unit 131 receives characters related to the object name 803, it searches for object names 803 corresponding to the place name character string 801 determined in S1702 and having the sent characters at the beginning. Also, when the character string/CG conversion processing unit 131 receives the information indicating that the click processing has been performed, all object names 803 corresponding to the place name character string 801 determined in S1702 and stored in the character string/set correspondence table 122 are searched (S2403).
- the character string/CG conversion processing unit 131 sends the detected object names 803 to the display control unit 127. Then, the display control unit 127 arranges and displays the sent object names 803 below the characters input in the object input unit 1405 (S2404).
- for example, when the click processing is performed, the character string/CG conversion processing unit 131 searches the character string/set correspondence table 122 for all the object names 803 corresponding to the place name character string 801 determined in S1702 (S2402, S2403), and sends them to the display control unit 127. Then, as shown in FIG. 25, the display control unit 127 displays the sent object names 803, together with "more..." for executing the search processing of the set data 114 by the search unit 128, side by side in the object input unit 1405 (S2404).
- the user can recognize what object names 803 can be selected.
- using the history of selections the user has made so far, the object names 803 may be arranged in descending order of the number of selections.
- the input control unit 126 monitors whether there is an input of an additional scenario to the scenario input in S2401 (S2405). Then, if there is an additional scenario in S2405, the processing after S2403 is performed on the character string including the additional scenario.
- the input control unit 126 monitors whether the user determines the object name 803 using the object input unit 1405, and when the user determines the object name 803, the determined object name 803 is sent to the character string/CG conversion processing unit 131, and the corresponding object data 805 is extracted and sent to the display control unit 127. The display control unit 127 then displays, on the preview display unit 1406 of the GUI 1400, a preview in which the already selected character performs the already selected action on the selected object data 805 (for example, sitting on a chair) (S2406).
- the input control unit 126 monitors whether the next candidate button 1407 of the GUI 1400 is pressed, whether "more..." displayed on the object input unit 1405 is selected, or whether the send button 1409 is pressed (S2407).
- if the send button 1409 is pressed, the display control unit 127 ends the processing. If it is determined that the next candidate button 1407 has been pressed, the character string/CG conversion processing unit 131 selects, from among the object data corresponding to the object name 803 selected by the user in the object input unit 1405, object data other than the object data whose preview was displayed in S2406. Then, the character string/CG conversion processing unit 131 sends the selected object data to the display control unit 127, and the display control unit 127 creates a preview using the sent object data, displays it on the preview display unit 1406 (S2408), and shifts to the processing of S2407.
- if "more..." displayed in the object input unit 1405 is selected in S2407, that is, if it is determined that execution of the search processing of the set data (object data) 114 by the search unit 128 is instructed, the search unit 128 performs the search processing (S2409), and shifts to the processing of S2407. The details of the search processing (S2409) will be described later.
- a list of selectable object names 803 can be displayed for the user.
- the user can easily grasp what object name 803 can be selected.
- the object names 803 may also be arranged in descending order of frequency of use of the object name 803 when the place name character string 801 is selected, when both the place name character string 801 and the verb character string 702 are selected, or when all three of the place name character string 801, the subject noun character string 601, and the verb character string 702 are selected.
- the image creating apparatus 100 can also receive a scenario consisting of a plurality of sentences input to the GUI 1400. In this case, computer graphics arranged in the order of the sentences can be created.
- in the second sentence, a character ("Man") different from the character ("Girl") of the first sentence may appear, as shown in 2601 in the figure.
- in this case, each of the characters is added as a target object on which the other character can operate. For example, "Man" 2701 is added as an object of "Girl", and "Girl" 2702 is added as an object of "Man".
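Adding each character as a selectable object for the other, as in the "Girl"/"Man" example above, could be sketched as follows; the dictionary layout mapping a character to its selectable objects is an assumption for illustration.

```python
def cross_register(objects_by_char, char_a, char_b):
    # Each character becomes a selectable target object for the other's
    # action, in addition to any ordinary objects (e.g. "bench").
    objects_by_char.setdefault(char_a, []).append(char_b)
    objects_by_char.setdefault(char_b, []).append(char_a)
    return objects_by_char

objs = cross_register({"Girl": ["bench"]}, "Girl", "Man")
print(objs)  # {'Girl': ['bench', 'Man'], 'Man': ['Girl']}
```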
- in the above description, the place name character strings are "Park" and "School", the subject noun character strings are "Girl" and "Man", and the verb character strings are "sits on", "goes close to", "stands up", "says", and the like, but the present invention is not limited to these.
- the examples of subject noun character strings are general nouns, but it is also possible to use frequently used proper nouns by assigning names of proper nouns to character data and storing them (for example, "Ai" for "Girl").
- the character adjective character string 603 and the object adjective character string 804 may be displayed before or at the time of displaying character data and object data, so that differences between the data can be easily understood, and character data and object data may be further searched (filtered) by displaying a list in or next to the subject input unit 1403 and the object input unit 1405, respectively.
- one or more users can share the character string/character correspondence table 118, the character string/operation correspondence table 120, and the character string/set correspondence table 122 by placing them on a server on the Internet 107. It is also possible to use the latest material data created by other users, and a large amount of material data can be used.
- the object names 803 can be described and stored in order from the character string with the highest number of selections, based on the history selected by a plurality of users, so that the most frequently used character string is retrieved first.
- similarly, the link information 802 may be stored in the order of the number of selections based on the history selected by a plurality of users, so that data with a high usage frequency is displayed first.
- the motion dictionary 124 may also be stored on the Internet 107 and shared by one or more users.
- FIG. 28 is a flowchart of search processing of the image creating apparatus 100 according to the present embodiment.
- the image creating apparatus 100 searches for the set data 114 in S1809, searches for the character data 112 in S2009, searches for the operation data 113 in S2209, and searches for the object data (set data 114) in S2409.
- in the following, the character data 112, the operation data 113, and the set data 114 will be collectively referred to as material data.
- the display control unit 127 of the image creating apparatus 100 creates the GUI 1500, which is a screen for searching for material data, and displays it on the monitor 105 (S2801). Then, the input control unit 126 of the image creating apparatus 100 monitors whether a feature is input from the feature input unit 1502 (S2802), and when a feature is input, sends a notification to that effect to the search unit 128.
- the image creating apparatus 100 refers to the hierarchical structure description 130, and displays on the feature input unit 1502 a list of Terms or Keys that partially match the input feature.
- the search unit 128 performs the search processing of material data for the feature input in S2802 (S2803). The details of the process of S2803 will be described later.
- the registration unit 129 of the image creating apparatus 100 acquires metadata corresponding to the material data determined in S2803 from the metadata database 111.
- the registration unit 129 uses the link information described in the acquired metadata to acquire the material data (character data 112, operation data 113, and set data 114) determined in S2803 from the character data database 108, the operation data database 109, or the set data database 110 (S2804).
- Next, the registration unit 129 registers the content of the metadata acquired in S2804, that is, the search result. Specifically, if the metadata acquired in S2804 relates to a character, the registration unit 129 adds its content to the character string/character correspondence table 118; if it relates to an operation, adds its content to the character string/operation correspondence table 120; and if it relates to a set, registers its content in the character string/set correspondence table 122 (S2805).
- In this way, the search results can be reflected in the character string/character correspondence table 118, the character string/operation correspondence table 120, and the character string/set correspondence table 122.
- As a result, information on material data once searched by the user is displayed on the GUI 1400 for scenario input thereafter. Therefore, the user can use material data that has been searched once without searching for it again.
- Next, the registration unit 129 stores the acquired material data. Specifically, when the acquired material data is the character data 112, the registration unit 129 stores it in the character data storage unit; when it is the operation data 113, stores it in the operation data storage unit 116; and when it is the set data 114, stores it in the set data storage unit 117 (S2806).
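The registration steps S2805 and S2806 amount to dispatching each search result by the kind of material it describes. A minimal sketch follows; the dictionaries standing in for the correspondence tables and the `kind` field are illustrative assumptions, not the apparatus's actual data layout.

```python
# Illustrative dispatch of a search result to the matching correspondence
# table, keyed by the kind of material the metadata describes
# ("character", "operation", or "set"). Names are hypothetical.
tables = {"character": [], "operation": [], "set": []}

def register(metadata):
    """Append the metadata content to the table for its material kind."""
    kind = metadata["kind"]
    tables[kind].append(metadata)

register({"kind": "operation", "name": "walk"})
register({"kind": "set", "name": "park"})
```

The point of the dispatch is that later scenario input can consult a single per-kind table, which is why a once-searched material no longer needs a second search.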
- FIG. 29 is a flowchart of material data search processing of the image creating apparatus 100 according to the present embodiment.
- First, the search unit 128 of the image creating apparatus 100 receives the feature (word) sent from the input control unit 126 (S2901).
- Next, the search unit 128 refers to the hierarchical structure description 130 and searches for the feature (word) input in S2901 or a word similar to it (S2902).
- Next, the search unit 128 determines whether the searched word relates to a Key or to a Term (S2903).
- If the searched word relates to a Key, the search unit 128 refers to the hierarchical structure description 130, retrieves the Terms that are child elements of the word (Key) searched in S2902, and sends a list of the retrieved Terms to the display control unit 127. The display control unit 127 then displays the sent list of Terms on the feature display unit 1503 of the GUI 1500 (S2904).
- The displayed Terms may be ordered by the user's usage history, alphabetically, or by priorities assigned to the Terms in advance.
- Next, the input control unit 126 monitors whether the user has selected a desired Term from the list of Terms displayed on the feature display unit 1503, and when a Term is selected, sends the selected Term to the search unit 128. The search unit 128 then determines the sent Term (S2905).
- Next, the search unit 128 refers to the metadata database 111 and searches for the metadata corresponding to the determined Term.
- Then, the search unit 128 sends the retrieved metadata to the character string/CG conversion storage unit 131, which uses the sent metadata to acquire thumbnails of the determined Term. The list of acquired thumbnails (search results) is sent to the display control unit 127, and the display control unit 127 displays the list of thumbnails on the thumbnail display unit 1505 of the GUI 1500 (S2906).
- Next, the search unit 128 refers to the hierarchical structure description 130, retrieves the Keys that are child elements of the Term determined in S2905, and sends a list of the retrieved Keys to the display control unit 127. The display control unit 127 then displays the sent list of Keys on the feature display unit 1504 of the GUI 1500 (S2909).
- Next, the input control unit 126 monitors whether the user has selected a desired Key from the list of Keys displayed on the feature display unit 1504, and when a Key is selected, sends the selected Key to the search unit 128. The search unit 128 then determines the sent Key (S2910).
- Then, the search unit 128 performs the processing of S2904 and subsequent steps on the Key determined in S2910 to determine the material data for the scenario.
- For example, when the Key "Wear" is input to the feature input unit 1502 (S2901), the image creating apparatus 100 refers to the hierarchical structure description 130, searches for "Shirt", "Trouser", and "Glasses", the Terms that are child elements of "Wear", and displays them on the feature display unit 1503 (S2902 to S2904).
- Next, when the user determines "Trouser.Color" (S2910), the image creating apparatus 100 refers to the hierarchical structure description 130, searches for "Blue", "Gray", "Green", and "Brown", the Terms that are child elements of "Trouser.Color", and displays them on the feature display unit 3001 (S2904).
- Then, when the user determines "Blue" (S2905), the image creating apparatus 100 displays thumbnails of the material data for "Blue" on the thumbnail display unit 1505 (S2906). The user then selects material data by selecting a desired thumbnail from the displayed thumbnails.
- In this way, when a Key is input as a feature, the image creating apparatus 100 searches for the material data using the input Key and a Term corresponding to it.
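The drill-down in this example (Key to child Terms, Term to child Keys, and so on) can be sketched with a nested dictionary standing in for the hierarchical structure description 130. The dictionary representation itself is an assumption for illustration; the labels mirror the example above.

```python
# Hypothetical fragment of the hierarchical structure description 130.
# A Key maps to its child Terms, and a Term maps to its child Keys,
# alternating down the hierarchy.
hierarchy = {
    "Wear": ["Shirt", "Trouser", "Glasses"],              # Key -> child Terms
    "Trouser": ["Trouser.Color"],                         # Term -> child Keys
    "Trouser.Color": ["Blue", "Gray", "Green", "Brown"],  # Key -> child Terms
}

def children(label):
    """Candidates shown on the feature display unit for the chosen label."""
    return hierarchy.get(label, [])

# Wear -> Trouser -> Trouser.Color -> Blue, as in the example above.
print(children("Wear"))
print(children("Trouser.Color"))
```

A leaf such as "Blue" has no children, at which point the search stops descending and the matching thumbnails are displayed.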
- On the other hand, if the word searched in S2902 relates to a Term, the search unit 128 refers to the hierarchical structure description 130, searches for the Keys that are parent elements of the word (Term), and determines whether there are a plurality of retrieved Keys (S2911). If it is determined in S2911 that there are a plurality of Keys that can be parent elements, a list of those Keys is sent to the display control unit 127, and the display control unit 127 displays the sent list of Keys on the feature display unit 1503 of the GUI 1500 (S2912).
- The number of displayed Keys may be set to a number desired by the user, or the like.
- The displayed Keys may be ordered by the user's usage history, alphabetically, or by priorities assigned to the Keys in advance; various orderings are conceivable.
- Next, the input control unit 126 monitors whether the user has selected a desired Key from the list of Keys displayed on the feature display unit 1503, and when a Key is selected, sends the selected Key to the search unit 128. The search unit 128 then determines the sent Key (S2913).
- Next, the search unit 128 refers to the metadata database 111 and searches for the metadata corresponding to the Key determined in S2913 and the Term input in S2901.
- Then, the search unit 128 sends the retrieved metadata to the character string/CG conversion storage unit 131, which uses the sent metadata to acquire thumbnails for the determined Key and Term and sends a list of the acquired thumbnails (search results) to the display control unit 127. The display control unit 127 then displays the list of thumbnails on the thumbnail display unit 1505 of the GUI 1500 (S2914).
- By looking at the thumbnails, the user can visually grasp an outline of the material data for the determined Key and Term.
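The lookup in S2914 pairs the determined Key with the input Term and retrieves the matching metadata, whose link information leads to the thumbnails. A sketch with an illustrative in-memory stand-in for the metadata database 111 (the entries and field names are invented for illustration):

```python
# Illustrative stand-in for the metadata database 111: each entry records
# the (Key, Term) pair it describes and a link to its thumbnail image.
metadata_db = [
    {"key": "Shirt.Color", "term": "Blue", "thumbnail": "shirt_blue.png"},
    {"key": "Trouser.Color", "term": "Blue", "thumbnail": "trouser_blue.png"},
    {"key": "Trouser.Color", "term": "Gray", "thumbnail": "trouser_gray.png"},
]

def search(key, term):
    """Metadata entries matching the determined Key and the input Term."""
    return [m for m in metadata_db if m["key"] == key and m["term"] == term]

thumbnails = [m["thumbnail"] for m in search("Shirt.Color", "Blue")]
```

Pairing the Term with its parent Key is what disambiguates a Term such as "Blue", which would otherwise match material data under several different Keys.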
- Next, the image creating apparatus 100 determines, in the input control unit 126, whether the user has made an input to further specify the search condition (Term), that is, an input to search using a higher-level concept (S2915).
- If there is such an input, the search unit 128 refers to the hierarchical structure description 130, searches for the Terms that are parent elements of the Key determined in S2913, and determines whether there are a plurality of retrieved Terms (S2916).
- Next, the input control unit 126 monitors whether the user has selected a desired Term from the list of Terms displayed on the feature display unit 1504, and when a Term is selected, sends the selected Term to the search unit 128. The search unit 128 then determines the sent Term (S2918).
- Then, the search unit 128 performs the processing of S2911 and subsequent steps on the Term determined in S2918.
- Next, the search unit 128 determines whether this Key is the Root element, that is, the highest-level Key (S2919).
- If the Key is the Root element, the search unit 128 refers to the metadata database 111 and searches for the metadata corresponding to the determined Key and Term.
- Then, the search unit 128 sends the retrieved metadata to the character string/CG conversion storage unit 131, which acquires thumbnails from the link information in the search results and sends a list of the acquired thumbnails (search results) to the display control unit 127. The display control unit 127 then displays the list of thumbnails on the thumbnail display unit 605 of the GUI 600 (S2921).
- Next, the image creating apparatus 100 waits for one of the thumbnails displayed in S2921 to be selected, and when one is selected, determines the material data corresponding to the selected thumbnail and ends the processing.
- For example, the image creating apparatus 100 displays thumbnails corresponding to "Shirt.Color" selected by the user on the thumbnail display unit 1505 (S2913, S2914).
- Next, when the user tries to search using a higher-level concept, the image creating apparatus 100 displays "Wear", the Key that is the parent element of "Shirt", on the feature display unit 1504; thumbnails for "Shirt" and "Wear" are displayed, and the user makes a selection (S2915, S2916, S2911 to S2914).
- Furthermore, the image creating apparatus 100 displays "Genre", the Key that is the parent element of "Human", on the feature display unit 3101; thumbnails for "Human" and "Genre" are displayed, and the user makes a selection (S2915, S2916, S2911 to S2914).
- Next, when the user tries to search using "Character", a Term higher than "Genre", the image creating apparatus 100 displays "Content", the Key that is the parent element of "Character", on the feature display unit 3102; thumbnails for "Character" and "Content" are displayed, and the user makes a selection (S2915, S2916, S2911, S2919, S2921).
- In this way, when the feature input for the scenario material is a Term, the image creating device 100 searches for material data using a pair consisting of the input Term and a Key that is its parent element.
- Also, the search range can be expanded by searching with a higher-level Term and the Key of its parent element.
- the user can search for desired material data from a suitable number of material data candidates.
- That is, material data can be searched while expanding the search range from features of a lower-level concept to features of a higher-level concept, in other words, from detailed features to rough features.
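The expansion from detailed to rough features corresponds to walking up parent elements until the Root element is reached. A sketch with hypothetical parent links (the links follow the labels used in the examples above; the data structure is an assumption):

```python
# Hypothetical parent links for walking a feature up toward the Root
# element, broadening the search as described above.
parent = {
    "Blue": "Trouser.Color",
    "Trouser.Color": "Trouser",
    "Trouser": "Wear",
    "Wear": "Content",  # "Content" plays the Root element in this sketch
}

def broaden(label):
    """Return the label followed by each successively higher-level ancestor."""
    chain = [label]
    while chain[-1] in parent:
        chain.append(parent[chain[-1]])
    return chain

print(broaden("Blue"))  # from the detailed feature up to the Root
```

Each step up the chain widens the set of matching material data, so the user can stop as soon as the candidate list reaches a comfortable size.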
- As described above, according to the present embodiment, material data can be searched during computer graphics creation processing, and computer graphics can be created using the searched material data.
- That is, the computer graphics creation process and the material data search process can be performed in a series of operations.
- As a result, the computer graphics creation process can be made faster and more reliable.
- Furthermore, by inputting a feature, the material data corresponding to that feature can be searched. This makes it possible to easily find desired material data even without knowing its name.
- The processing performed by the image creating apparatus 100 may be implemented as a program executed by a general-purpose computer.
- In the present embodiment, the process of creating computer graphics from a scenario input, the process of newly searching for material data, and the process of registering newly searched material data are performed in a series of operations; however, these processes may also be performed in different apparatuses.
- The present invention can be applied to a wide range of fields, such as mobile phones and other communication devices that use computer graphics to transmit desired messages and information to other users.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Databases & Information Systems (AREA)
- Data Mining & Analysis (AREA)
- Library & Information Science (AREA)
- Human Computer Interaction (AREA)
- Computer Graphics (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/597,464 US7797330B2 (en) | 2004-01-27 | 2005-01-26 | Image formation device and image formation method |
EP05704126A EP1703468A1 (en) | 2004-01-27 | 2005-01-26 | Image formation device and image formation method |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-018839 | 2004-01-27 | ||
JP2004018839 | 2004-01-27 | ||
JP2004029599 | 2004-02-05 | ||
JP2004-029599 | 2004-02-05 | ||
JP2005017468A JP4603373B2 (ja) | 2004-01-27 | 2005-01-25 | 画像作成装置および画像作成方法 |
JP2005-017468 | 2005-01-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005071616A1 true WO2005071616A1 (ja) | 2005-08-04 |
Family
ID=34811795
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/000989 WO2005071616A1 (ja) | 2004-01-27 | 2005-01-26 | 画像作成装置および画像作成方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US7797330B2 (ja) |
EP (1) | EP1703468A1 (ja) |
JP (1) | JP4603373B2 (ja) |
KR (1) | KR20060128962A (ja) |
WO (1) | WO2005071616A1 (ja) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008134966A (ja) * | 2006-11-29 | 2008-06-12 | Sony Corp | データ管理サーバ、データ管理システム、データ管理方法およびプログラム |
US20100077359A1 (en) * | 2006-12-19 | 2010-03-25 | Yoshihiro Shinawaki | Map display device |
JP2008299493A (ja) * | 2007-05-30 | 2008-12-11 | Kaoru Sumi | コンテンツ作成支援システム及びコンピュータプログラム |
US8296438B2 (en) * | 2007-07-11 | 2012-10-23 | International Business Machines Corporation | Dynamically configuring a router to find the best DHCP server |
US20140198109A1 (en) * | 2013-01-16 | 2014-07-17 | International Business Machines Corporation | Method and system for preserving a graphics file |
JP5908855B2 (ja) * | 2013-02-21 | 2016-04-26 | 日本電信電話株式会社 | 3次元オブジェクト生成装置、方法、及びプログラム |
GB2519312A (en) * | 2013-10-16 | 2015-04-22 | Nokia Technologies Oy | An apparatus for associating images with electronic text and associated methods |
US20220084424A1 (en) * | 2020-09-16 | 2022-03-17 | Daniel Gray | Interactive communication system for special needs individuals |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04264972A (ja) * | 1991-02-20 | 1992-09-21 | Sharp Corp | 自然言語処理装置及びそれを利用した動画形成装置 |
JPH08263681A (ja) * | 1995-03-22 | 1996-10-11 | Matsushita Electric Ind Co Ltd | アニメーション作成装置およびその方法 |
JPH09167251A (ja) * | 1995-12-14 | 1997-06-24 | Canon Inc | アニメーション生成装置及びその方法 |
JPH09167165A (ja) * | 1995-12-15 | 1997-06-24 | Matsushita Electric Ind Co Ltd | 動画像生成方法 |
JP2000331182A (ja) * | 1999-05-21 | 2000-11-30 | Fujitsu Ltd | アニメーション編集装置とアニメーション再生装置とプログラム記録媒体 |
JP2001243221A (ja) * | 2000-03-02 | 2001-09-07 | Takuo Kitamura | 電子機器の文字入力方法と文字入力システム |
JP2002108875A (ja) * | 2000-10-04 | 2002-04-12 | Soatec Inc | 電子マニュアル装置及び電子マニュアルの変更方法 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3133243B2 (ja) | 1995-12-15 | 2001-02-05 | 株式会社エヌケーインベストメント | オンラインショッピングシステム |
US6654031B1 (en) * | 1999-10-15 | 2003-11-25 | Hitachi Kokusai Electric Inc. | Method of editing a video program with variable view point of picked-up image and computer program product for displaying video program |
JP2001125894A (ja) | 1999-10-29 | 2001-05-11 | Sony Corp | 文書編集処理装置及び文書編集処理方法およびプログラム提供媒体 |
US20020016707A1 (en) * | 2000-04-04 | 2002-02-07 | Igor Devoino | Modeling of graphic images from text |
JP2001307137A (ja) | 2000-04-21 | 2001-11-02 | Sony Corp | 情報処理装置および方法、並びに格納媒体 |
US6721706B1 (en) * | 2000-10-30 | 2004-04-13 | Koninklijke Philips Electronics N.V. | Environment-responsive user interface/entertainment device that simulates personal interaction |
-
2005
- 2005-01-25 JP JP2005017468A patent/JP4603373B2/ja not_active Expired - Fee Related
- 2005-01-26 WO PCT/JP2005/000989 patent/WO2005071616A1/ja active Application Filing
- 2005-01-26 EP EP05704126A patent/EP1703468A1/en not_active Withdrawn
- 2005-01-26 US US10/597,464 patent/US7797330B2/en not_active Expired - Fee Related
- 2005-01-26 KR KR1020067015171A patent/KR20060128962A/ko not_active Application Discontinuation
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04264972A (ja) * | 1991-02-20 | 1992-09-21 | Sharp Corp | 自然言語処理装置及びそれを利用した動画形成装置 |
JPH08263681A (ja) * | 1995-03-22 | 1996-10-11 | Matsushita Electric Ind Co Ltd | アニメーション作成装置およびその方法 |
JPH09167251A (ja) * | 1995-12-14 | 1997-06-24 | Canon Inc | アニメーション生成装置及びその方法 |
JPH09167165A (ja) * | 1995-12-15 | 1997-06-24 | Matsushita Electric Ind Co Ltd | 動画像生成方法 |
JP2000331182A (ja) * | 1999-05-21 | 2000-11-30 | Fujitsu Ltd | アニメーション編集装置とアニメーション再生装置とプログラム記録媒体 |
JP2001243221A (ja) * | 2000-03-02 | 2001-09-07 | Takuo Kitamura | 電子機器の文字入力方法と文字入力システム |
JP2002108875A (ja) * | 2000-10-04 | 2002-04-12 | Soatec Inc | 電子マニュアル装置及び電子マニュアルの変更方法 |
Also Published As
Publication number | Publication date |
---|---|
EP1703468A1 (en) | 2006-09-20 |
US7797330B2 (en) | 2010-09-14 |
US20080228713A1 (en) | 2008-09-18 |
KR20060128962A (ko) | 2006-12-14 |
JP2005251174A (ja) | 2005-09-15 |
JP4603373B2 (ja) | 2010-12-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8856163B2 (en) | System and method for providing a user interface with search query broadening | |
JP4622589B2 (ja) | 情報処理装置および方法、プログラム、並びに記録媒体 | |
US9092173B1 (en) | Reviewing and editing word processing documents | |
WO2005071616A1 (ja) | 画像作成装置および画像作成方法 | |
JPH0991314A (ja) | 情報探索装置 | |
US10698917B2 (en) | Managing electronic slide decks | |
US10656814B2 (en) | Managing electronic documents | |
US11372873B2 (en) | Managing electronic slide decks | |
EP1212697A1 (en) | Method and apparatus for building a user-defined technical thesaurus using on-line databases | |
JP2023519816A (ja) | データ品質を自動的に改善すること | |
JP2023521558A (ja) | デザイン空間の自動的かつ知的な探索 | |
JP2010086455A (ja) | 検索条件指定装置、検索条件指定方法及びプログラム | |
RU2698405C2 (ru) | Способ поиска в базе данных | |
JP3356519B2 (ja) | 文書情報検索装置 | |
JP2006350477A (ja) | ファイル管理装置及びその制御方法、並びに、コンピュータプログラム及びコンピュータ可読記憶媒体 | |
JPH09231238A (ja) | テキスト検索結果表示方法及び装置 | |
JP2007316743A (ja) | 部分文書検索プログラム、部分文書検索方法および部分文書検索装置 | |
CN110297965B (zh) | 课件页面的显示及页面集的构造方法、装置、设备和介质 | |
JP2007279978A (ja) | 文書検索装置及び文書検索方法 | |
KR101835994B1 (ko) | 키워드 맵을 이용한 전자책 검색 서비스 제공 방법 및 장치 | |
JPH1145252A (ja) | 情報検索装置およびその装置としてコンピュータを機能させるためのプログラムを記録したコンピュータ読み取り可能な記録媒体 | |
JP2021120790A (ja) | 文章構造描画装置 | |
Caenen et al. | Show me what you mean! PARISS: A CBIR-interface that learns by example | |
JP7441576B1 (ja) | 情報処理システム、情報処理方法及びプログラム | |
JP6979738B1 (ja) | サーバおよびアニメーション推薦システム、アニメーション推薦方法、プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2005704126 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020067015171 Country of ref document: KR Ref document number: 200580003380.3 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 2005704126 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10597464 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 1020067015171 Country of ref document: KR |