WO2018026082A1 - Device and method for creating an animation - Google Patents

Device and method for creating an animation

Info

Publication number
WO2018026082A1
Authority
WO
WIPO (PCT)
Prior art keywords
item
character
facial expression
animation
expression
Prior art date
2016-08-02
Application number
PCT/KR2017/001844
Other languages
English (en)
Korean (ko)
Inventor
최진성
Original Assignee
주식회사 씨투몬스터
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2016-08-02
Filing date
2017-02-20
Publication date
2018-02-08
Application filed by 주식회사 씨투몬스터 filed Critical 주식회사 씨투몬스터
Publication of WO2018026082A1

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G06T 13/20 3D [Three Dimensional] animation
    • G06T 13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2213/00 Indexing scheme for animation
    • G06T 2213/12 Rule based animation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T 2219/20 Indexing scheme for editing of 3D models
    • G06T 2219/2004 Aligning objects, relative positioning of parts

Definitions

  • the present invention relates to an apparatus and method for producing animation.
  • An embodiment of the present invention is to provide an animation production apparatus and method that allows an individual to directly produce an animation with a terminal such as a smartphone or tablet PC.
  • An embodiment of the present invention is to provide an animation production apparatus and method with which the motion and facial expression of a character can be implemented easily and simply, as desired.
  • the storage unit may store an action item defining an action of the character.
  • the processor may load an action item selected by a user from at least one action item stored in the storage unit, and control the character so that it moves according to the action defined in the loaded action item.
  • the processor may construct an action item list in response to a user's command selecting, from among a plurality of action items stored in the storage unit, the action items to be used for producing the animation, and, after recording of the animation has started and an action item is selected by the user from the action item list, control the character so that it moves according to the action defined in that action item.
  • when the user newly selects an action item with a second command while an action item is selected and the character is moving, the processing unit may control the character so that, after the current movement of the character ends, it moves according to the action defined in the newly selected action item.
  • the storage unit may store an expression item defining an expression of the character.
  • the processor may load the facial expression item selected by the user from at least one facial expression item stored in the storage, and control the character such that the facial expression of the character is formed according to the facial expression defined in the loaded facial expression item.
  • the processing unit may construct a facial expression item list in response to a user's command selecting, from among a plurality of facial expression items stored in the storage unit, the facial expression items to be used for producing the animation, and, after recording of the animation has started and a facial expression item is selected by the user from the facial expression item list, control the character so that the facial expression of the character is formed according to the expression defined in that facial expression item.
  • when the user selects at least one facial expression item from the facial expression item list, the processing unit may control the character so that the facial expression of the character is formed as a composite expression obtained by combining a preset basic facial expression with the expression defined in the selected facial expression item.
  • the display unit may display the facial expression item list in which the icons of the facial expression items are arranged along the circumference of a preset figure, and the input unit may receive from the user a command indicating a point within the figure.
  • the basic expression and the expression defined in the selected facial expression item may be synthesized based on the distance between the point indicated by the user and the icon of the facial expression item.
  • the processor may synthesize the basic facial expression and the expression defined in the selected facial expression item by computing a weighted average of the basic coordinate values of the face components preset to implement the basic facial expression and the coordinate values of the face components matched to the facial expression item to implement the expression corresponding to that facial expression item.
  • the processor may select a preset number of facial expression items from the facial expression item list in order of proximity to the point, calculate a reference distance between the point and a reference point within the figure as well as the distances between the point and the icons of the selected facial expression items, assign a reference weight based on the reference distance to the basic coordinate values of the face components and weights based on the respective distances to the coordinate values of the face components matched to the selected facial expression items, and output the average of the reference-weighted basic coordinate values and the weighted coordinate values as the coordinate values of the face components implementing the composite expression.
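One way to make this weighted average concrete is the following formulation; it is only an interpretation consistent with the description, with inverse-distance weights chosen here as one possibility, since the text requires only that a shorter distance yield a larger weight. For each face component with basic coordinate value $x_0$ and coordinate value $x_i$ in the $i$-th of the $k$ selected facial expression items,

$$
x_{\mathrm{composite}} = \frac{w_0\,x_0 + \sum_{i=1}^{k} w_i\,x_i}{w_0 + \sum_{i=1}^{k} w_i},
\qquad
w_0 = \frac{1}{d_0 + \varepsilon}, \quad w_i = \frac{1}{d_i + \varepsilon},
$$

where $d_0$ is the reference distance from the indicated point to the reference point of the figure, $d_i$ is the distance from the indicated point to the icon of the $i$-th selected facial expression item, and $\varepsilon$ is a small constant avoiding division by zero when the point coincides with an icon or the reference point.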
  • An animation production method according to an embodiment may include: receiving a command for producing the animation from a user; loading an item prepared in advance for use in producing the animation according to the user's command; and applying the loaded item to a character appearing in the animation.
  • loading the item may include loading an action item selected by the user from at least one action item defining an action of the character, and applying the loaded item to the character may include controlling the character so that it moves according to the action defined in the loaded action item.
  • the animation production method may further include: receiving from the user a first command newly selecting an action item while an action item is selected and the character is moving; and, in response to the first command, canceling the movement of the character and controlling the character so that it moves according to the action defined in the newly selected action item.
  • the animation production method may further include: receiving a user's command selecting the facial expression items to be used for producing the animation from among a plurality of facial expression items; and constructing a facial expression item list in response to the user's command. In this case, receiving a command from the user may include receiving a command selecting a facial expression item from the facial expression item list after recording of the animation has started, loading the item may include loading the facial expression item selected by the user from the facial expression item list, and applying the loaded item to the character may include controlling the character so that the expression of the character is formed according to the expression defined in that facial expression item.
  • controlling the character so that the expression of the character is formed according to the expression defined in the facial expression item may include, when the user selects at least one facial expression item from the facial expression item list, controlling the character so that the facial expression of the character is formed as a composite expression in which a preset basic facial expression is combined with the expression defined in the selected facial expression item.
  • synthesizing the basic facial expression and the expression defined in the selected facial expression item based on the distance may include computing a weighted average of the basic coordinate values of the face components preset to implement the basic facial expression and the coordinate values of the face components matched to the facial expression item to implement the expression corresponding to that facial expression item.
  • synthesizing the basic facial expression and the expression defined in the selected facial expression item by weighted average may include: selecting a preset number of facial expression items from the facial expression item list in order of proximity to the point; calculating a reference distance between the point and a reference point within the figure and the distances between the point and the icons of the selected facial expression items; assigning a reference weight based on the reference distance to the basic coordinate values of the face components, and assigning weights based on the respective distances to the coordinate values of the face components matched to the selected facial expression items; and calculating the average of the reference-weighted basic coordinate values and the weighted coordinate values and outputting it as the coordinate values of the face components implementing the composite expression.
  • assigning the reference weight and the weights may include assigning a higher reference weight to the basic coordinate values of the face components as the reference distance becomes shorter, and assigning a higher weight to the coordinate values of the face components as the corresponding distance becomes shorter.
  • an individual can directly produce an animation using a smartphone, a tablet PC, a personal computer, or the like, without expensive equipment or advanced software.
  • FIG. 2 is an exemplary diagram for describing a process of producing an animation according to an embodiment of the present invention.
  • FIG. 5 is an exemplary diagram for describing a process of constructing an action item list according to an embodiment of the present invention.
  • FIGS. 7 and 8 are exemplary diagrams for explaining a process of implementing the expression of the character according to an embodiment of the present invention.
  • FIG. 9 is an exemplary diagram for explaining a process of constructing a facial expression item list according to an embodiment of the present invention.
  • FIG. 10 is an exemplary diagram showing a facial expression item list according to another embodiment of the present invention.
  • FIG. 14 is an exemplary flowchart of an animation production method according to an embodiment of the present invention.
  • FIG. 15 is an exemplary flowchart for explaining a process of implementing an operation of a character according to an embodiment of the present invention.
  • FIG. 16 is an exemplary flowchart for describing a process of implementing an operation of a character according to another embodiment of the present invention.
  • FIG. 17 is an exemplary flowchart for describing a process of implementing an operation of a character according to another embodiment of the present invention.
  • FIG. 19 is an exemplary flowchart for describing a process of implementing an expression of a character according to another embodiment of the present invention.
  • terms such as '~unit', '~group', '~block', and '~module' used throughout the present specification may mean a unit for processing at least one function or operation.
  • for example, they may mean software, or a hardware component such as an FPGA or an ASIC.
  • however, '~unit', '~group', '~block', and '~module' are not limited to software or hardware.
  • '~unit', '~group', '~block', and '~module' may be configured to reside in an addressable storage medium or to run on one or more processors.
  • accordingly, '~unit', '~group', '~block', and '~module' include components such as software components, object-oriented software components, class components, and task components, as well as processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuits, data, databases, data structures, tables, arrays, and variables.
  • the components and the functions provided within '~unit', '~group', '~block', and '~module' may be combined into a smaller number of components, or may be further separated into additional components.
  • FIG. 1 is an exemplary block diagram of an animation production apparatus 100 according to an embodiment of the present invention.
  • the animation production apparatus 100 includes an input unit 110, a storage unit 120, a processing unit 130, and a display unit 140.
  • the animation production apparatus 100 is a computing device including a processor and a memory, and may be, for example, a smart device (eg, a smartphone, a tablet PC, etc.).
  • the animation producing apparatus 100 is not limited to a smart device, and may be any computer (eg, a personal computer) as long as it includes a processor and a memory.
  • the input unit 110 receives a command for producing an animation from a user.
  • the input unit 110 is an input device for receiving data, and may include, for example, a touch pad, a keyboard, a mouse, a touch pen, and the like. If the input unit 110 is a touch pad and the touch pad is combined with the display unit 140, the input unit 110 and the display unit 140 may constitute a touch screen.
  • the storage unit 120 stores an item prepared in advance for use in producing an animation.
  • the storage unit 120 is a storage device that stores data.
  • the storage unit 120 may include an HDD, an SSD, a RAM, a ROM, a cache, a register, and the like.
  • the processor 130 loads the item according to a user's command and applies it to a character appearing in an animation.
  • the processor 130 is a processor that processes data, and may include, for example, a CPU, a GPU, or an AP.
  • the display unit 140 displays an animation.
  • the display unit 140 is a display device that displays data on a screen.
  • the display unit 140 may include an LCD.
  • the animation production apparatus 100 may further include a communication unit 150.
  • the communication unit 150 may transmit the produced animation to another terminal.
  • the communication unit 150 is a communication device for transmitting and receiving data.
  • the communication unit 150 may include a wireless data communication module, a wireless LAN module, a short range communication module, and the like, but is not limited thereto and may include a wired data communication module.
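Taken together, the apparatus is a composition of these units. The following Python sketch is only a structural illustration under assumed class and method names (none of which come from the patent):

```python
class AnimationProductionApparatus:
    """Composition of the units described above (reference numerals shown for orientation)."""

    def __init__(self, input_unit, storage_unit, processing_unit, display_unit,
                 communication_unit=None):
        self.input_unit = input_unit                  # 110: receives the user's commands
        self.storage_unit = storage_unit              # 120: stores items prepared in advance
        self.processing_unit = processing_unit        # 130: loads items and applies them to the character
        self.display_unit = display_unit              # 140: displays the character and the animation
        self.communication_unit = communication_unit  # 150 (optional): sends the produced animation out

    def handle_command(self, item_id, character):
        """Load the item named by the user's command and apply it to the character."""
        item = self.storage_unit.load(item_id)        # assumed storage API
        self.processing_unit.apply(item, character)   # assumed processing API
        self.display_unit.show(character)             # assumed display API
```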
  • FIG. 2 is an exemplary diagram for describing a process of producing an animation according to an embodiment of the present invention.
  • the animation production apparatus 100 displays the character 200 appearing in the animation on the display unit 140.
  • Data relating to the character 200 may be prepared in advance for use in the production of animation and stored in the storage unit 120.
  • the user may select and load a desired character from the character list stored in the storage unit 120 to produce an animation.
  • a user may also directly generate and edit the character 200 using the animation production apparatus 100.
  • the user may customize the character by selecting and loading desired ones from the faces, bodies, costumes, accessories, etc. of the character 200 prepared in advance and stored in the storage unit 120.
  • the user may then produce an animation featuring the character 200 by directly implementing the motion and facial expression of the character 200 using the animation production apparatus 100.
  • FIGS. 3 and 4 are exemplary diagrams for explaining a process of implementing the operation of the character 200 according to an embodiment of the present invention.
  • the storage unit 120 may store an action item defining an action of the character 200.
  • the processor 130 loads the action item selected by the user from at least one action item stored in the storage unit 120, and controls the character 200 so that the character 200 moves according to the action defined in the loaded action item.
  • the animation production apparatus 100 may display the icon 210 of at least one action item stored in the storage 120 on the display 140.
  • the animation production apparatus 100 may further display a recording button 201 for starting and ending the recording of the animation on the display unit 140.
  • the user may select an icon of the action item to be applied to the character 200 from the icon 210 of the action item displayed on the display unit 140.
  • the processor 130 loads the action item selected by the user from the storage unit 120 and controls the character 200 so that the character 200 moves according to the action defined in the loaded action item.
  • the character 200 moving according to the motion defined in the motion item may be displayed on the display unit 140.
  • the action item is data defining a motion of the character 200 and may implement at least one of a pose, i.e. a stationary posture of the character 200, and a motion in which the character moves for a predetermined time.
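As an illustration only, an action item of this kind could be represented as follows; the Python layout below is a hypothetical sketch (field and class names are not taken from the patent), modelling an action item as an optional held pose plus an optional timed sequence of keyframes:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

# A pose maps joint or bone names to target positions (or rotations), e.g. "left_arm": (x, y, z).
JointPose = Dict[str, Tuple[float, float, float]]

@dataclass
class Keyframe:
    time: float      # seconds from the start of the motion
    pose: JointPose  # target pose of the character's skeleton at this time

@dataclass
class ActionItem:
    """Hypothetical action item: a stationary pose and/or a motion played for a predetermined time."""
    name: str
    icon: str                                              # icon shown in the action item list
    pose: JointPose = field(default_factory=dict)          # stationary pose, if any
    motion: List[Keyframe] = field(default_factory=list)   # keyframes of a timed motion, if any

    @property
    def duration(self) -> float:
        """Length of the motion in seconds; 0.0 for a pure pose."""
        return self.motion[-1].time if self.motion else 0.0
```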
  • FIG. 5 is an exemplary diagram for explaining a process of constructing an action item list 210 according to an embodiment of the present invention.
  • the processor 130 may construct an action item list in response to a user's command selecting, from among a plurality of action items stored in the storage unit 120, the action items to be used for producing the animation. After recording of the animation starts, when an action item is selected by the user from the action item list, the processor 130 may control the character 200 so that the character 200 moves according to the action defined in that action item.
  • the animation production apparatus 100 may display a plurality of action items 211 stored in the storage 120 on the display unit 140.
  • the user may select an action item to be used for making an animation from among the action items 211 and add it to the action item list 210.
  • the user may press the confirmation button 202 to complete the configuration of the action item list 210.
  • the user may implement the action of the character by selecting the action item from the action item list 210 as shown in FIG. 4.
  • when the user newly selects an action item with a first command while an action item is selected and the character 200 is moving, the processor 130 may cancel the movement of the character 200 and control the character 200 so that it moves according to the action defined in the newly selected action item.
  • for example, while the user has selected one action item from the action item list 210 and the character 200 is moving, the user may touch the icon of another action item in the action item list 210.
  • the processor 130 may cancel the current movement of the character 200 when the user touches an icon of another action item and resume the movement of the character 200 according to the action defined in the newly selected action item.
  • the first command for the user to newly select the action item is an input of touching an icon of the action item.
  • unlike the above embodiment, when the user newly selects an action item with a second command while an action item is selected and the character 200 is moving, the processing unit 130 may control the character 200 so that, after the current movement of the character 200 ends, the character 200 moves according to the action defined in the newly selected action item.
  • for example, while the user has selected one action item from the action item list 210 and the character 200 is moving, as shown in FIG. 4, the user may double-tap or touch and hold the icon of another action item in the action item list 210.
  • in this case, unlike when the icon is simply touched, the processor 130 continues the current movement of the character 200, and after that movement ends, implements the movement of the character 200 according to the action defined in the newly selected action item.
  • the second command for the user to newly select an action item is an input of double tapping or long touching the icon of the action item.
  • the user may thus select the action items to be applied to the character 200 from the action item list 210 with either the first command or the second command, thereby implementing the motion of the character 200 appearing in the animation as desired.
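The two commands therefore amount to interrupt-versus-queue semantics. A minimal sketch of a controller with that behavior is given below, assuming the ActionItem type sketched earlier is in scope; the class and method names are illustrative and do not come from the patent:

```python
from collections import deque
from typing import Deque, Optional

class CharacterMotionController:
    """Plays action items on a character: a tap interrupts, a double-tap or long touch queues."""

    def __init__(self) -> None:
        self.current: Optional[ActionItem] = None   # motion currently playing, if any
        self.queue: Deque[ActionItem] = deque()     # motions to play after the current one ends

    def on_first_command(self, item: ActionItem) -> None:
        """First command (touch): cancel the current movement and start the new one immediately."""
        self.queue.clear()
        self.current = item

    def on_second_command(self, item: ActionItem) -> None:
        """Second command (double tap / long touch): let the current movement finish, then play the new one."""
        if self.current is None:
            self.current = item
        else:
            self.queue.append(item)

    def on_motion_finished(self) -> None:
        """Called when the current motion ends; advances to the next queued action item, if any."""
        self.current = self.queue.popleft() if self.queue else None
```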
  • FIG. 6 is an exemplary diagram illustrating a state of playing an animation in which an operation of the character 200 is implemented according to an embodiment of the present invention.
  • the animation production apparatus 100 may display an animation control button 203 on the display unit 140.
  • the user may play or pause an animation produced by the user by manipulating the animation control button 203 displayed on the display unit 140.
  • the animation production apparatus 100 may further display a progress bar 204 on the display unit 140.
  • the user may adjust the playback point of the animation by manipulating the progress bar 204 displayed on the display unit 140.
  • the user can freely implement an operation of the character 200 appearing in the animation by selecting a desired action item from at least one action item provided by the animation production apparatus 100.
  • FIGS. 7 and 8 are exemplary diagrams for explaining a process of implementing the expression of the character 200 according to an embodiment of the present invention.
  • the storage unit 120 may store an expression item defining an expression of the character 200.
  • the processor 130 loads the facial expression item selected by the user from at least one facial expression item stored in the storage 120, and controls the character 200 so that the facial expression of the character 200 is formed according to the expression defined in the loaded facial expression item.
  • the animation production apparatus 100 may display the icon 220 of at least one facial expression item stored in the storage 120 on the display 140.
  • the animation production apparatus 100 may further display a recording button 201 to start and end the recording of the animation on the display unit 140.
  • the user may select an icon of the facial expression item to be applied to the character 200 from the icon 220 of the facial expression item displayed on the display unit 140.
  • the processor 130 loads the facial expression item selected by the user from the storage unit 120 and controls the character 200 so that the character 200 has the facial expression defined in the loaded facial expression item.
  • the character 200 having an expression formed according to the expression defined in the expression item may be displayed on the display unit 140.
  • the facial expression item is data defining the facial expression of the character 200 and includes values necessary to implement the facial appearance of the character 200.
  • the facial expression item may include a coordinate value indicating the position of the facial skeleton of the character 200.
  • the face of the character 200 may be implemented by one or more facial bones.
  • the facial expression item may define the facial expression of the character 200 including the position of the facial bone of the character 200, for example, a coordinate value at which the end of the facial bone is located.
  • the facial expression item may include a coordinate value indicating a position of a predetermined point on the face of the character 200.
  • the facial expression of the character 200 may be formed by positions of main parts such as eyes, nose, and mouth in addition to facial bones.
  • the facial expression item may define the expression of the character 200 including the positions of the main parts of the face of the character 200, for example, the coordinate values at which both ends of the eyes are positioned, the coordinate values at which the ends of the nose are positioned, and the coordinate values at which both corners of the mouth are located.
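In other words, a facial expression item reduces to a set of named face components with target coordinate values. A minimal sketch, assuming a flat 2D coordinate representation and hypothetical component names (the patent does not fix a concrete data format):

```python
from dataclasses import dataclass
from typing import Dict, Tuple

# Coordinate values of face components: facial bone ends and feature endpoints,
# e.g. "left_eye_outer", "nose_tip", "mouth_left_corner".
FaceCoords = Dict[str, Tuple[float, float]]

@dataclass
class ExpressionItem:
    """Hypothetical facial expression item: a name, an icon, and target face-component coordinates."""
    name: str
    icon: str
    coords: FaceCoords

# Illustrative values only: a smiling expression raises the corners of the mouth.
SMILE = ExpressionItem(
    name="smile",
    icon="smile.png",
    coords={"mouth_left_corner": (-1.0, 0.4), "mouth_right_corner": (1.0, 0.4)},
)
```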
  • FIG. 9 is an exemplary diagram for describing a process of constructing the facial expression item list 220 according to an embodiment of the present invention.
  • the processing unit 130 may construct a facial expression item list in response to a user's command selecting, from among a plurality of facial expression items stored in the storage unit 120, the facial expression items to be used for producing the animation. After recording of the animation starts, when a facial expression item is selected by the user from the facial expression item list, the processing unit 130 may control the character 200 so that the facial expression of the character 200 is formed according to the expression defined in that facial expression item.
  • the animation production apparatus 100 may display a plurality of facial expression items 221 stored in the storage 120 on the display 140.
  • the user may select an expression item to be used for the production of the animation from the plurality of expression items 221 and add it to the expression item list 220.
  • the user may press the confirmation button 202 to complete the configuration of the facial expression item list 220.
  • the user may implement the facial expression of the character by selecting the facial expression item from the facial expression item list 220 as shown in FIG. 8.
  • the processor 130 may control the character 200 so that the facial expression of the character 200 is formed as a composite expression in which a preset basic facial expression and the expression defined in the selected facial expression item are synthesized.
  • FIG. 10 is an exemplary diagram showing a facial expression item list 220 according to another embodiment of the present invention.
  • the display unit 140 may display a facial expression item list 220 in which icons 221 to 226 of facial expression items are arranged along a circumference of a preset figure.
  • in this example, the facial expression item list 220 is configured such that the icons 221 to 226 of six facial expression items are arranged at equal intervals along the circumference of a circle, but the present invention is not limited thereto.
  • the input unit 110 may receive a command for indicating a point in the figure from a user.
  • the display unit 140 may further display a pointer 205 within the figure around whose circumference the icons 221 to 226 of the facial expression items are arranged.
  • the user may indicate a point in the figure by moving the pointer 205 by dragging or the like.
  • the processor 130 may synthesize the basic expression and the expression defined in the selected facial expression item based on the distances between the point indicated by the user and the icons 221 to 226 of the facial expression items.
  • FIGS. 11 and 12 are exemplary diagrams for describing a process of implementing the expression of the character 200 using the facial expression item list 220 according to another embodiment of the present invention.
  • the processor 130 may change the expression of the character 200; for example, the character 200 may be controlled to have the smiling expression defined in one facial expression item.
  • the expression of the character 200 is determined according to the position of the pointer 205 within the figure; more specifically, it may be implemented by synthesizing the basic expression and the expressions defined in the facial expression items based on the distances between the position of the pointer 205 and the icons 221 to 226 of the facial expression items.
  • the expression of the character 200 may be a preset basic expression (e.g., an expressionless face, an open mouth, etc.).
  • the facial expression of the character 200 may be implemented as a synthetic facial expression in which the basic facial expression and the facial expression defined in the selected facial expression item are synthesized according to the position of the pointer 205 in the figure.
  • the processor 130 may synthesize the basic facial expression and the expression defined in the selected facial expression item by computing a weighted average of the basic coordinate values of the face components preset to implement the basic facial expression and the coordinate values of the face components matched to the facial expression item to implement the corresponding expression. That is, the processor 130 synthesizes the basic facial expression and the expression of the facial expression item by taking a weighted average of the face component coordinate values of the basic facial expression and the face component coordinate values of the facial expression item.
  • the facial component may include elements used to implement the facial appearance of the character 200, for example, the aforementioned facial bones, main parts of the face, and the like.
  • the processing unit 130 may select a preset number of facial expression items from among the facial expression items included in the facial expression item list 220 in order of proximity to the point indicated by the user (i.e., the position P of the pointer 205).
  • for example, the processing unit 130 may select, from among the first to sixth facial expression items 221 to 226, the facial expression item closest to the pointer 205 (the first facial expression item 221 in FIG. 12) and the second closest facial expression item (the second facial expression item 222 in FIG. 12), that is, two facial expression items.
  • the processor 130 may calculate a reference distance d0 between the point P and a reference point within the figure (for example, the center C0 of the figure) and the distances d1 and d2 between the point P and the icons 221 and 222 of the selected facial expression items.
  • the processor 130 may assign a reference weight based on the reference distance d0 to the basic coordinate values of the face components, and assign weights based on the distances d1 and d2 to the coordinate values of the face components matched to the selected facial expression items 221 and 222.
  • the processor 130 may calculate the average of the basic coordinate values of the face components to which the reference weight is assigned and the coordinate values of the face components to which the weights are assigned, and output it as the coordinate values of the face components implementing the composite expression.
  • the processor 130 may assign a higher reference weight to the basic coordinate values of the face components as the reference distance d0 becomes shorter.
  • the processor 130 may assign a higher weight to the coordinate values of the face components as the distances d1 and d2 become shorter.
  • as a result, the facial expression of the character 200 in FIG. 12 may be formed so that, among the first to sixth facial expression items 221 to 226 included in the facial expression item list 220, the smiling expression defined in the first facial expression item 221 closest to the pointer 205 is reflected the most, and the expression defined in the second closest facial expression item 222 is partially reflected.
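To make the procedure of FIGS. 10 to 12 concrete, the following Python sketch is one hypothetical implementation; the circular icon layout helper, the choice of two nearest items, and the inverse-distance weighting are assumptions on top of the description, which only requires that shorter distances yield higher weights.

```python
import math
from typing import Dict, List, Tuple

Point = Tuple[float, float]
FaceCoords = Dict[str, Tuple[float, float]]   # face component name -> coordinate value

def icon_positions(n: int, radius: float = 1.0) -> List[Point]:
    """Positions of n expression-item icons placed at equal intervals on a circle around the origin."""
    return [(radius * math.cos(2 * math.pi * k / n),
             radius * math.sin(2 * math.pi * k / n)) for k in range(n)]

def blend_expression(pointer: Point,
                     basic: FaceCoords,
                     items: List[FaceCoords],
                     icons: List[Point],
                     k: int = 2,
                     center: Point = (0.0, 0.0),
                     eps: float = 1e-6) -> FaceCoords:
    """Blend the basic expression with the k expression items whose icons are closest to the pointer."""
    def dist(a: Point, b: Point) -> float:
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # Select the k facial expression items in order of proximity to the pointer (d1, d2, ...).
    nearest = sorted(range(len(items)), key=lambda i: dist(pointer, icons[i]))[:k]

    # Reference weight from the reference distance d0 (pointer to the figure's center): closer => larger.
    d0 = dist(pointer, center)
    weights = [1.0 / (d0 + eps)]
    sources = [basic]
    for i in nearest:
        weights.append(1.0 / (dist(pointer, icons[i]) + eps))  # item weights from d1, d2, ...
        sources.append(items[i])

    # Weighted average of the face-component coordinate values.
    total = sum(weights)
    blended: FaceCoords = {}
    for name in basic:
        x = sum(w * src.get(name, basic[name])[0] for w, src in zip(weights, sources)) / total
        y = sum(w * src.get(name, basic[name])[1] for w, src in zip(weights, sources)) / total
        blended[name] = (x, y)
    return blended
```

With the pointer sitting near one icon, that item's expression dominates; near the center of the figure, the basic expression dominates; in between, the two nearest items are mixed in, which matches the behavior described for FIG. 12.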
  • FIG. 13 is an exemplary diagram illustrating playing an animation in which an expression of the character 200 is implemented according to an embodiment of the present invention.
  • the animation production apparatus 100 may display the animation control button 203 and the progress bar 204 on the display unit 140.
  • the user may play or pause the animation produced by the user by manipulating the animation control button 203 displayed on the display unit 140, and adjust the playback point of the animation by manipulating the progress bar 204.
  • when the animation is played, the character 200 makes the expression implemented as described above. Furthermore, when the expression of the character 200 is implemented after the motion of the character 200 has been implemented, the character 200 may exhibit both the motion and the expression implemented by the user.
  • the user can freely implement the expression of the character 200 appearing in the animation by selecting a desired expression item from at least one expression item provided by the animation production apparatus 100. Furthermore, the user can freely synthesize the expressions of the facial expression items and apply them to the character 200, so that the expression of the character 200 can be implemented in a more varied and dynamic way.
  • FIG. 14 is an exemplary flowchart of an animation production method 1000 according to an embodiment of the present invention.
  • the animation production method 1000 may be executed by the animation production apparatus 100 according to the embodiment of the present invention described above.
  • the animation production method 1000 includes receiving a command for producing an animation from a user (S1100), loading an item prepared in advance for use in producing the animation according to the user's command (S1200), and applying the loaded item to the character 200 appearing in the animation (S1300).
  • FIG. 15 is an exemplary flowchart for explaining a process of implementing an operation of the character 200 according to an embodiment of the present invention.
  • the step of loading an item may include a step of loading an action item selected by a user among at least one action item defining an action of the character 200.
  • applying the loaded item to the character 200 may include controlling the character 200 so that the character 200 moves according to the action defined in the loaded action item (S1310).
  • the animation production method 1000 may further include receiving a user's command selecting the action items to be used for producing the animation from among a plurality of action items, and constructing the action item list 210 in response to the user's command.
  • step S1100 of receiving a command from the user may include receiving from the user a command selecting an action item from the action item list 210 after recording of the animation starts.
  • step S1200 of loading an item may include loading the action item selected by the user from the action item list 210.
  • the step of applying the loaded item to the character 200 may include controlling the character 200 so that the character 200 moves according to the action defined in the corresponding action item.
  • FIG. 16 is an exemplary flowchart for describing a process of implementing an operation of the character 200 according to another embodiment of the present invention.
  • the animation production method 1000 may further include receiving from the user a first command (e.g., touching the icon of an action item) newly selecting an action item while an action item is selected and the character 200 is moving, and canceling the movement of the character 200 in response to the first command and controlling the character 200 so that the character 200 moves according to the action defined in the newly selected action item (S1312).
  • FIG. 17 is an exemplary flowchart for explaining a process of implementing an operation of the character 200 according to another embodiment of the present invention.
  • the animation production method 1000 may further include receiving from the user a second command (e.g., double-tapping or touching and holding the icon of an action item) newly selecting an action item while an action item is selected and the character 200 is moving (S1313), and controlling the character 200 so that, after the movement of the character 200 ends, the character 200 moves according to the action defined in the newly selected action item (S1314).
  • FIG. 18 is an exemplary flowchart for describing a process of implementing an expression of the character 200 according to an embodiment of the present invention.
  • loading an item (S1200) may include loading a facial expression item selected by the user from among at least one facial expression item defining an expression of the character 200 (S1220).
  • applying the loaded item to the character 200 (S1300) may include controlling the character 200 so that the expression of the character 200 is formed according to the expression defined in the loaded facial expression item (S1320).
  • the animation production method 1000 may further include receiving a user's command selecting the facial expression items to be used for producing the animation from among a plurality of facial expression items, and constructing the facial expression item list 220 in response to the user's command.
  • FIG. 19 is an exemplary flowchart for describing a process of implementing an expression of the character 200 according to another embodiment of the present invention.
  • in step S1322, a reference weight based on the reference distance d0 is assigned to the basic coordinate values of the face components, and weights based on the distances d1 and d2 are assigned to the coordinate values of the face components matched to the selected facial expression items.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a device and method for creating an animation. The animation creation device according to an embodiment of the present invention may comprise: an input unit for receiving a command relating to the creation of an animation from a user; a storage unit for storing an item prepared in advance for use in creating the animation; a processing unit for loading the item according to the user's command and applying it to a character appearing in the animation; and a display unit for displaying the animation.
PCT/KR2017/001844 2016-08-02 2017-02-20 Device and method for creating an animation WO2018026082A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020160098357A KR101809601B1 (ko) 2016-08-02 2016-08-02 Apparatus and method for producing animation
KR10-2016-0098357 2016-08-02

Publications (1)

Publication Number Publication Date
WO2018026082A1 (fr) 2018-02-08

Family

ID=60954423

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/001844 WO2018026082A1 (fr) 2016-08-02 2017-02-20 Dispositif et procédé de création d'une animation

Country Status (2)

Country Link
KR (1) KR101809601B1 (fr)
WO (1) WO2018026082A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102181410B1 * 2017-12-18 2020-11-23 김은아 Apparatus for generating a character using a figure chart and generation method therefor
KR20200039965A * 2018-10-08 2020-04-17 주식회사 해피업 Electronic device loaded with an interactive educational content program for young children
KR102338501B1 * 2020-03-06 2021-12-13 주식회사 어라운드이펙트 Method for automatically arranging text-based animation data and apparatus therefor


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100112335A * 2009-04-09 2010-10-19 삼성전자주식회사 Apparatus and method for generating video-based facial animation
KR20110045719A * 2009-10-27 2011-05-04 주식회사 숀픽쳐스 Animation production method, computer-readable recording medium storing a program for executing the production method, and online animation production system using the same
KR20110056664A * 2009-11-23 2011-05-31 삼성전자주식회사 Method and apparatus for video calls in a mobile communication terminal
KR20130087311A * 2012-01-27 2013-08-06 엔에이치엔아츠(주) Avatar service system and method for providing an avatar in a service provided in a mobile environment
JP2015125636A * 2013-12-26 2015-07-06 Kddi株式会社 Emotion expression device, emotion expression method, and computer program

Also Published As

Publication number Publication date
KR101809601B1 (ko) 2017-12-15

Similar Documents

Publication Publication Date Title
WO2015167160A1 (fr) Procédé d'affichage d'instruction et dispositif d'affichage d'instruction
WO2017104987A1 (fr) Dispositif de photographie et son procédé de commande
WO2013065929A1 (fr) Télécommande et son procédé de fonctionnement
WO2017140079A1 (fr) Procédé et appareil de commande d'interactions pour réalité virtuelle
WO2019139270A1 (fr) Dispositif d'affichage et procédé de fourniture de contenu associé
WO2018026082A1 (fr) Dispositif et procédé de création d'une animation
WO2022158820A1 (fr) Systèmes et procédés pour manipuler des vues et des objets partagés dans un espace xr
WO2021242005A1 (fr) Dispositif électronique et procédé de génération d'autocollant d'émoji basés sur un avatar d'utilisateur
WO2021133053A1 (fr) Dispositif électronique et son procédé de commande
WO2020184935A1 (fr) Appareil électronique et procédé de commande associé
WO2015012607A1 (fr) Procédé d'affichage et dispositif électronique associé
WO2021251632A1 (fr) Dispositif d'affichage pour générer un contenu multimédia et procédé de mise en fonctionnement du dispositif d'affichage
WO2021075705A1 (fr) Dispositif électronique et son procédé de commande
WO2015122725A1 (fr) Procédé et appareil pour créer un groupe de communication
WO2016122153A1 (fr) Appareil d'affichage et son procédé de commande
WO2020027417A1 (fr) Dispositif électronique et procédé pour fournir un outil de saisie virtuelle
WO2022215823A1 (fr) Procédé et dispositif de génération d'image
WO2020242064A1 (fr) Dispositif mobile, et procédé de commande de dispositif mobile
WO2023095952A1 (fr) Procédé de génération automatique d'écran et dispositif associé
WO2021118267A1 (fr) Dispositif électronique et procédé de commande associé
WO2020036323A1 (fr) Appareil électronique, son procédé de commande et système électronique
WO2013172522A1 (fr) Terminal pouvant composer un message texte et procédé de commande
WO2013065912A1 (fr) Procédé, terminal et support d'enregistrement pour commander une sortie d'écran
WO2020204572A1 (fr) Dispositif électronique et son procédé de commande
WO2019198913A1 (fr) Dispositif électronique, et procédé de commande associé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17837134

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17837134

Country of ref document: EP

Kind code of ref document: A1