CN111105482B - Animation system, method and computer-readable storage medium - Google Patents

Animation system, method and computer-readable storage medium

Info

Publication number
CN111105482B
CN111105482B (application CN201911346616.3A)
Authority
CN
China
Prior art keywords
behavior
cluster
unit
animation
objects
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911346616.3A
Other languages
Chinese (zh)
Other versions
CN111105482A (en)
Inventor
王弘艺
Current Assignee
Shanghai Lilith Technology Corp
Original Assignee
Shanghai Lilith Technology Corp
Priority date
Filing date
Publication date
Application filed by Shanghai Lilith Technology Corp filed Critical Shanghai Lilith Technology Corp
Priority to CN201911346616.3A priority Critical patent/CN111105482B/en
Publication of CN111105482A publication Critical patent/CN111105482A/en
Application granted granted Critical
Publication of CN111105482B publication Critical patent/CN111105482B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00Animation
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The invention provides an animation production system, an animation production method, and a computer-readable storage medium. The animation production system comprises an animation rendering engine and a timeline editor; the animation rendering engine forms behavior instructions, and the timeline editor comprises a classification unit, an action imparting unit, and a control unit. The classification unit gathers on-scene objects with similar behavior into clusters according to their behavior similarity; the action imparting unit imparts an object behavior to each cluster, so that on-scene objects gathered into the same cluster exhibit the same behavior; the control unit is provided with built-in spontaneous behavior instructions, and when the on-scene objects in any cluster have finished executing their object behavior, they receive and execute the spontaneous behavior instructions. With this technical scheme, animation production efficiency can be improved, the behavior-control load of objects appearing in the animation can be reduced, and the expressive forms of the animation can be further enriched.

Description

Animation system, method and computer-readable storage medium
Technical Field
The present invention relates to the field of animation, and in particular, to an animation system, method, and computer-readable storage medium.
Background
As material life becomes more abundant, the demand for cultural and spiritual enrichment grows, and animation has become the entertainment medium of choice for many people. Most current animation production is based on editing a timeline, an abstract representation that simulates the passage of time in the animation. As shown in fig. 1 and 2, after the animation starts playing in the timeline editor, the timeline moves from left to right, representing the lapse of time. During production, tracks can be added in the editor to control the characters and resources that appear in the animation. The specific control mechanism is to place command bars on a track: when the timeline passes a command bar, the corresponding command is executed. In this way, specific characters, special effects, and other elements can be created at specific points on the timeline, and various character actions, such as moving or playing a skeletal animation, can be triggered at specific positions on the timeline. This editing mode requires creating a track for each character that exhibits these behaviors and placing instructions at specific positions on that track.
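The prior-art track-and-command-bar mechanism described above can be sketched as follows. This is an illustrative model only; all class and method names are invented for the sketch and are not taken from the patent or any specific engine.

```python
# Illustrative sketch of a prior-art timeline: each track belongs to one
# character and holds command bars; a command fires when the timeline
# cursor passes the bar's time. Names are hypothetical.

class Track:
    def __init__(self, character):
        self.character = character
        self.command_bars = []   # list of (time, command) pairs

    def add_command(self, time, command):
        self.command_bars.append((time, command))
        self.command_bars.sort(key=lambda bar: bar[0])

class Timeline:
    def __init__(self):
        self.tracks = []
        self.cursor = 0.0
        self.log = []            # record of executed (character, command)

    def advance(self, new_time):
        # Execute every command bar the cursor passes while moving forward.
        for track in self.tracks:
            for time, command in track.command_bars:
                if self.cursor < time <= new_time:
                    self.log.append((track.character, command))
        self.cursor = new_time

timeline = Timeline()
hero = Track("hero")
hero.add_command(1.0, "move")
hero.add_command(3.0, "play_skeletal_animation")
timeline.tracks.append(hero)
timeline.advance(2.0)   # only the command bar at 1.0 s has been passed
```

The sketch also makes the prior art's scaling problem concrete: one `Track` per character means hundreds of tracks for a crowd scene.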
This approach is flexible and efficient when the number of characters is small. However, to enrich the expressive forms of an animation, the number of on-screen characters can be very large, as in a large battle scene or a shot in which many species appear. In that case, to control the behavior of each character, as shown in fig. 3, a correspondingly large number of tracks must be created, and the timing of every behavior must be calculated for every character. Since characters may also affect one another, the designer must anticipate these interactions and design for them, which makes the design effort considerable.
Therefore, a new animation production system and method are needed that save production time and allow large-scale animation to be completed with simple operations.
Disclosure of Invention
To overcome these technical defects, the object of the present invention is to provide an animation production system, an animation production method, and a computer-readable storage medium that improve animation production efficiency, reduce the behavior-control load of objects appearing in the animation, and further enrich the expressive forms of the animation.
The invention discloses an animation production system comprising an animation rendering engine and a timeline editor, wherein the animation rendering engine forms behavior instructions to control on-scene objects in an animation;
the timeline editor relies on a timeline track of the animation and comprises a classification unit, an action imparting unit, and a control unit;
the classification unit gathers on-scene objects with similar behavior into a first cluster, a second cluster, ..., an n-th cluster according to the behavior similarity of the on-scene objects, where n ∈ Z+;
the action imparting unit imparts an object behavior to each cluster, so that on-scene objects gathered into the same cluster exhibit the same behavior, and the object behaviors imparted to different clusters differ;
the control unit is provided with built-in spontaneous behavior instructions, and when the on-scene objects in any cluster have finished executing their object behavior, they receive and execute the spontaneous behavior instructions.
Preferably, the classification unit defines each on-scene object as an object unit;
each object unit comprises a unit expression element and a unit control element, wherein the unit expression element renders appearance changes for the object unit, and the unit control element renders the object behavior for the object unit;
the classification unit defines the behavior similarity of the on-scene objects according to similar appearance changes and/or behavior changes.
Preferably, each cluster includes a cluster control element and a cluster command element;
the cluster control element automatically receives the imparted object behavior and passes it to the unit control elements, so that all object units in the same cluster exhibit the same object behavior;
the cluster command element receives the spontaneous behavior instructions from the control unit and passes them to the unit control elements, so that some or all of the object units in the same cluster execute the autonomous behavior included in the spontaneous behavior instructions.
Preferably, a state machine is built into each object unit, the state machine representing the behavior state of the object unit;
the state machine sends the behavior state to the control unit, and when the control unit learns that a trigger condition for a behavior-state change of the object unit is met, it sends a spontaneous behavior instruction to the cluster command element.
Preferably, when an object unit receives at least two object behaviors or spontaneous behavior instructions at the same moment, the object unit executes the object behavior or spontaneous behavior instruction with the highest priority according to the priorities of the object behaviors and spontaneous behavior instructions.
Preferably, when the classification unit gathers on-scene objects into clusters, any one or more of a cluster unit model, a cluster arrangement model, a number of cluster units, and a cluster behavior instruction are set for each cluster;
when the action imparting unit imparts an object behavior to each cluster, the object behavior includes any one or more of a moving target object, a moving target cluster, a moving target position, a moving speed, or a moving pattern.
Preferably, the timeline editor further comprises a reorganization unit;
the reorganization unit monitors the object behavior of all or some of the object units in any cluster, deconstructs the cluster when the object behavior of some object units diverges, and gathers object units with the same behavior similarity into a new cluster.
Preferably, the classification unit divides the object units into protagonist units and supporting-role units according to their importance, and separates the protagonist units from the clusters into which the supporting-role units are gathered;
the action imparting unit and the control unit impart independent protagonist behaviors and protagonist behavior instructions to the protagonist units.
The invention also discloses an animation production method, wherein:
a timeline editor relies on a timeline track of the animation, and its classification unit gathers on-scene objects with similar behavior into a first cluster, a second cluster, ..., an n-th cluster according to the behavior similarity of the on-scene objects, where n ∈ Z+;
the action imparting unit of the timeline editor imparts an object behavior to each cluster, so that on-scene objects gathered into the same cluster exhibit the same behavior, and the object behaviors imparted to different clusters differ;
the control unit of the timeline editor is provided with built-in spontaneous behavior instructions, and when the on-scene objects in any cluster have finished executing their object behavior, they receive and execute the spontaneous behavior instructions.
The invention also discloses a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the following steps are realized:
a timeline editor relies on a timeline track of the animation, and its classification unit gathers on-scene objects with similar behavior into a first cluster, a second cluster, ..., an n-th cluster according to the behavior similarity of the on-scene objects, where n ∈ Z+;
the action imparting unit of the timeline editor imparts an object behavior to each cluster, so that on-scene objects gathered into the same cluster exhibit the same behavior, and the object behaviors imparted to different clusters differ;
the control unit of the timeline editor is provided with built-in spontaneous behavior instructions, and when the on-scene objects in any cluster have finished executing their object behavior, they receive and execute the spontaneous behavior instructions.
Compared with the prior art, the adoption of the above technical scheme has the following beneficial effects:
1. when producing an animation, the animation designer can concentrate design resources on the protagonists of the animation or on the user's focus of attention, saving production time and improving production efficiency;
2. the behaviors of the objects appearing in the animation are richer, and the animated scene does not appear distorted;
3. the maintenance cost of the animation production process is reduced, the reuse rate of animation assets is improved, and collaboration with other designers becomes more convenient.
Drawings
FIG. 1 is a schematic diagram of a prior-art animation timeline;
FIG. 2 is a schematic diagram of the execution behavior of a single animated object as the timeline passes in the prior art;
FIG. 3 is a schematic diagram of the execution behavior of a plurality of animated objects as the timeline passes in the prior art;
FIG. 4 is a schematic diagram of an animation system according to a preferred embodiment of the invention;
FIG. 5 is a schematic diagram of the execution behavior of a plurality of clusters as the timeline passes, according to a preferred embodiment of the present invention;
FIG. 6 is a flow chart of an animation method according to a preferred embodiment of the invention.
Detailed Description
Advantages of the invention are further illustrated in the following description, taken in conjunction with the accompanying drawings and detailed description.
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. Where the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in this disclosure to describe various information, the information should not be limited to these terms; they are only used to distinguish one type of information from another. For example, first information may also be referred to as second information and, similarly, second information as first information, without departing from the scope of the present disclosure. The word "if" as used herein may be interpreted as "when," "upon," or "in response to a determination," depending on the context.
In the description of the present invention, it should be understood that the terms "longitudinal," "transverse," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate orientations or positional relationships based on the orientation or positional relationships shown in the drawings, merely to facilitate describing the present invention and simplify the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and therefore should not be construed as limiting the present invention.
In the description of the present invention, unless otherwise specified and defined, the terms "mounted," "connected," and "coupled" are to be construed broadly: a connection may be mechanical or electrical, and two elements may be connected directly or indirectly through intermediaries or may communicate with each other, as those skilled in the art would understand from the specific context.
In the following description, suffixes such as "module", "component", or "unit" for representing elements are used only for facilitating the description of the present invention, and are not of specific significance per se. Thus, "module" and "component" may be used in combination.
Referring to FIG. 4, a schematic diagram of an animation system according to a preferred embodiment of the invention is shown. In this embodiment, the animation system includes an animation rendering engine and a timeline editor. The animation rendering engine may be a Unity 2019.1 engine, and the timeline editor is a commonly used editor framework that may also be developed fully independently, without depending on a specific development platform. The animation rendering engine forms behavior instructions according to the animation designer's design, and these are used to control the on-scene objects appearing in the animation. An on-scene object is anything presented in the animation frame, such as the male and female leads, animals, buildings, weather, magic effects, and so on. The behavior instructions for an on-scene object describe image changes such as changes of shape and structure, changes of appearance and form, and changes of position and effects.
In this embodiment, to process on-scene objects by classification and aggregation, the timeline editor relies on the timeline track of the animation and includes a classification unit, an action imparting unit, and a control unit.
The classification unit obtains the behavior instructions of each on-scene object, or edits the behavior of each on-scene object according to the animation designer's design. For example, the male protagonist may be designed to raise his left hand, throw a punch, or speak, while a supporting character speaks in unison with him or points in a certain direction. When some on-scene objects have the same or similar behaviors, for example, when several objects in the frame are designed to move simultaneously from the left side of the frame to the right side at similar speeds, or several large trees are designed to sway in the wind, those on-scene objects are regarded as having approximately the same behavior. After judging that their behaviors are approximate, the classification unit gathers those on-scene objects into one cluster, and in this way gathers all on-scene objects into a first cluster, a second cluster, ..., an n-th cluster according to their behavior, where n ∈ Z+. That is, all on-scene objects are grouped and categorized based on the similarity of their behavior. The behavior similarity referred to in this embodiment includes the similarity of appearance changes of several objects, such as opening the mouth wide or smiling in unison; the similarity of their actions, such as raising a hand or lifting a leg at the same time; and the similarity of their positional changes, such as all walking from left to right (regardless of distance, considering only direction).
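The classification unit's grouping step can be sketched minimally as follows. The behavior key used here, a simplified (action, direction) pair, is an assumption for illustration; the patent does not prescribe a specific similarity metric.

```python
# Illustrative sketch (not the patent's actual code) of the classification
# unit: on-scene objects with the same behavior key are gathered into the
# same cluster. The (action, direction) key is a stand-in for a real
# behavior-similarity measure.

def classify(scene_objects):
    """Group on-scene objects into numbered clusters by behavior similarity."""
    clusters = {}
    for obj in scene_objects:
        key = (obj["action"], obj.get("direction"))
        clusters.setdefault(key, []).append(obj["name"])
    # Number the clusters: first cluster, second cluster, ..., n-th cluster
    return {i + 1: members for i, members in enumerate(clusters.values())}

scene = [
    {"name": "tree_a", "action": "sway", "direction": "wind"},
    {"name": "tree_b", "action": "sway", "direction": "wind"},
    {"name": "bird_1", "action": "fly", "direction": "right"},
]
clusters = classify(scene)   # trees swaying in the wind share one cluster
```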
It will be appreciated that, depending on how behavior similarity is divided, the number of clusters may differ: all on-scene objects may be gathered into one cluster (a first cluster), or every two on-scene objects may be gathered into their own cluster (a first cluster, a second cluster, and so on).
After the on-scene objects are gathered into clusters, the action imparting unit imparts an object behavior to each cluster, so that the on-scene objects within the same cluster have the same behavior. For example, the first cluster may be given an object behavior in which its on-scene objects raise both hands during a certain period, and the second cluster an object behavior in which its on-scene objects fly from the left side to the right side of the frame during a certain period. When object behaviors are assigned to clusters, the behaviors assigned to different clusters differ (if they were the same, the clusters could be merged), so that on-scene objects in different clusters perform different actions according to the design. It may be appreciated that gathering into clusters and imparting actions need not have a precedence or causal relationship: when an on-scene object is gathered into a cluster, this does not mean that the object already has a certain behavior, nor that it already behaves similarly to the other objects in the cluster. Rather, according to the animation designer's future design, some on-scene objects may or should have a certain behavior, and they are gathered into a cluster according to that future behavior; the action imparting unit then imparts the object behavior that the designer planned when the on-scene objects were gathered. That is, the classification unit and the action imparting unit group and assign based on the same future behavior that the designer intends for the on-scene objects.
The control unit is provided with built-in spontaneous behavior instructions. During animation design, once on-scene objects have been gathered and given object behaviors, the objects in a cluster execute those behaviors along the animation's time axis; after the object behavior imparted by the action imparting unit has finished as designed, the spontaneous behavior instructions of the control unit take over, and the corresponding instructions are executed to display further behavior. For example, all birds within the frame may be gathered into one cluster whose imparted object behavior is to fly from the lower left of the frame to the middle. When the birds, as on-scene objects, have flown from the lower left to the middle, the object behavior is complete; the spontaneous behavior instruction then makes the birds scatter randomly from the middle toward the edges of the frame, so that the overall behavior of the birds appears as flying from the lower left to the middle and then dispersing. That is, the control unit gives the cluster a degree of behavioral autonomy: when the object behavior is complete, or some other condition is met, the on-scene objects in the cluster execute random, preset, or otherwise determined behaviors.
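The bird example above can be reduced to a short sketch: the imparted behavior runs first, and the control unit's spontaneous instruction takes over when it finishes. All names and the frame-based timing are hypothetical.

```python
# Minimal sketch of the control unit's role in the bird example:
# after the cluster's imparted behavior finishes, a built-in spontaneous
# behavior instruction takes over. Names and timings are illustrative.

import random

def run_cluster(cluster_behavior_frames, spontaneous_instruction, total_frames):
    """Return the behavior executed each frame: imparted first, then spontaneous."""
    history = []
    for frame in range(total_frames):
        if frame < cluster_behavior_frames:
            history.append("fly_to_center")            # imparted object behavior
        else:
            history.append(spontaneous_instruction())  # control unit takes over
    return history

rng = random.Random(0)  # seeded so the "random scatter" is reproducible
scatter = lambda: "scatter_" + rng.choice(["north", "south", "east", "west"])
history = run_cluster(cluster_behavior_frames=3,
                      spontaneous_instruction=scatter,
                      total_frames=5)
```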
Throughout the movement of the timeline, if the cluster in which an object unit is located has been issued an object behavior, the object unit behaves according to the cluster's command; if the object unit has completed the cluster's object behavior command, or has not received one, it generates spontaneous behavior according to the judgment of the control unit.
With this configuration, on the one hand, after the on-scene objects are gathered, a given action design can be applied uniformly to part of the on-scene objects through a cluster; on the other hand, the control unit gives the on-scene objects a degree of behavioral autonomy. This saves the animation designer effort in planning the animation and enriches the expressiveness of the frame.
With continued reference to FIG. 4, in a preferred embodiment, to facilitate the design and production of animation, the classification unit defines each on-scene object as an object unit, and action design is performed on each on-scene object as a minimum unit. Each object unit is provided with a unit expression element and a unit control element: the unit expression element renders appearance changes for the object unit so as to display them, and the unit control element renders the object behavior so that the object unit performs certain actions. For example, when an object unit receives a command to move to a certain position, the unit control element controls the object unit to move there, and if it would collide with other object units, it avoids them. After the appearance changes and/or object behaviors of the object units are pre-designed, the classification unit defines the behavior similarity of the objects on the basis of similar appearance changes and/or behavior changes.
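The split between a unit expression element (appearance) and a unit control element (behavior) might be modeled as below. The one-dimensional position and the sidestep collision avoidance are deliberate simplifications, not the patent's actual logic.

```python
# Hypothetical sketch of an object unit with a unit expression element
# (appearance changes) and a unit control element (object behavior).
# Collision avoidance is reduced to sidestepping along a 1-D axis.

class ObjectUnit:
    def __init__(self, name, position):
        self.name = name
        self.position = position        # 1-D position for simplicity
        self.appearance = "idle"

    # Unit expression element: renders appearance changes.
    def express(self, appearance):
        self.appearance = appearance

    # Unit control element: renders object behavior (here, movement),
    # avoiding positions occupied by other object units.
    def move_to(self, target, occupied):
        while target in occupied:
            target += 1                 # sidestep to avoid a collision
        self.position = target

a = ObjectUnit("soldier_a", 0)
b = ObjectUnit("soldier_b", 5)
a.express("walking")
a.move_to(5, occupied={b.position})     # 5 is taken, so a sidesteps to 6
```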
Further, each cluster includes a cluster control element and a cluster command element. The cluster control element is connected to the action imparting unit to receive the imparted object behavior, and is connected to the unit control elements to pass that behavior on to them. That is, the object behavior designed for a set of object units is delivered to each object unit in the cluster via the action imparting unit, the cluster control element, and the unit control element. The cluster command element is connected to the control unit to receive the spontaneous behavior instructions designed by the animation staff, and is connected to the unit control elements to pass those instructions on, so that some or all of the object units in the same cluster execute the autonomous behaviors included in the spontaneous behavior instructions.
To confirm whether the object units have performed the corresponding object behavior, a state machine is provided in each object unit. The state machine indicates the behavior state of the object unit; for example, the recorded behavior state may be that the object unit is static, is moving in the frame at a certain speed, is growing from small to large, and so on. The state machine sends the behavior state to the control unit. When the control unit learns from a state change that the object unit has executed its object behavior, or that some other condition has been triggered (such as reaching a certain position or completing an action to a certain degree), it sends a spontaneous behavior instruction to the cluster command element so that the object unit executes the spontaneous behavior. For example, the control unit may know whether the object unit is currently moving or is an attack target and control it accordingly; for instance, the object unit can be set to stand and wait in place after moving to a specified location.
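The state-machine reporting path can be sketched as follows, with the "arrived" trigger condition and the "stand and wait" reaction taken from the example above; class names are invented for the sketch.

```python
# Sketch (with hypothetical names) of a per-unit state machine reporting
# its behavior state to the control unit, which issues a spontaneous
# behavior instruction once the trigger condition "arrived" is met.

class ControlUnit:
    def __init__(self):
        self.issued = []                 # spontaneous instructions sent out

    def on_state_change(self, unit_name, state):
        if state == "arrived":           # trigger condition is met
            self.issued.append((unit_name, "stand_and_wait"))

class StateMachine:
    def __init__(self, unit_name, control_unit):
        self.unit_name = unit_name
        self.control = control_unit
        self.state = "static"

    def set_state(self, state):
        self.state = state
        self.control.on_state_change(self.unit_name, state)

control = ControlUnit()
sm = StateMachine("bird_1", control)
sm.set_state("moving")
sm.set_state("arrived")   # triggers the spontaneous instruction
```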
Considering that even the same or similar object units may perform the same or different object behaviors at different moments of the timeline, as shown in fig. 5, some object units are gathered into the second cluster during seconds 1-5 and into the first cluster at second 6 (though they could remain in the second cluster). When an object unit receives at least two object behaviors or spontaneous behavior instructions at the same moment, it executes the one with the highest priority according to the priorities of the object behaviors and spontaneous behavior instructions. Depending on priority, an object unit in a cluster may stop its current object behavior and instead perform a particular action, unless the current object behavior or spontaneous behavior instruction has the higher priority. Commands such as attacking a certain cluster/object unit, evading a certain cluster/object unit, or cluster reorganization may all serve as cluster commands.
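The priority rule amounts to a simple arbitration step. The specific instruction names and priority values below are illustrative; the patent leaves the priority scheme to the designer.

```python
# Minimal sketch of the arbitration described above: when several object
# behaviors or spontaneous instructions arrive at the same moment, the
# object unit executes the highest-priority one. Priorities are invented.

def select_instruction(instructions):
    """instructions: list of (name, priority); the highest priority wins."""
    return max(instructions, key=lambda item: item[1])[0]

pending = [
    ("continue_march", 1),        # current cluster object behavior
    ("evade_enemy_cluster", 3),   # spontaneous instruction from control unit
    ("attack_target", 2),         # another cluster command
]
chosen = select_instruction(pending)
```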
In a further preferred embodiment, a track model for controlling clusters is added in the timeline editor, and various cluster commands can be created on this track. Cluster commands include cluster creation, cluster movement, cluster behavior, and the like. Specifically, when the classification unit gathers on-scene objects into clusters, any one or more of a cluster unit model, a cluster arrangement model, a number of cluster units, and a cluster behavior instruction are set for each cluster. The cluster unit model may be the type of the cluster's units, such as people, animals, plants, buildings, or weather; the cluster arrangement model is a preset pattern in which the object units of a cluster are arranged, such as a straight line, a zigzag, an F shape, or an S shape; the number of cluster units is the number of object units included in each cluster after gathering by behavior similarity; and the cluster behavior instruction is the object behavior. Setting the cluster unit model, the cluster arrangement model, the number of cluster units, and the cluster behavior instruction in a preset manner can increase the gathering speed of the classification unit and the imparting speed of the action imparting unit.
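A cluster-creation command on the control track might carry the four preset fields named above. The field names and validation below are assumptions for illustration only.

```python
# Hypothetical cluster-creation command covering the four preset fields:
# cluster unit model, cluster arrangement model, number of cluster units,
# and cluster behavior instruction. Field names are assumptions.

def create_cluster(unit_model, arrangement, unit_count, behavior):
    allowed_arrangements = {"line", "zigzag", "F", "S"}
    if arrangement not in allowed_arrangements:
        raise ValueError("unknown cluster arrangement model")
    return {
        "cluster_unit_model": unit_model,          # e.g. people, animals, weather
        "cluster_arrangement_model": arrangement,  # preset spatial pattern
        "number_of_cluster_units": unit_count,
        "cluster_behavior_instruction": behavior,
    }

birds = create_cluster("animal", "S", 40, "fly_to_center")
```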
In addition, when the action imparting unit imparts an object behavior to each cluster, the object behavior includes any one or more of a moving target object, a moving target cluster, a moving target position, a moving speed, or a moving pattern. The moving target object is another object toward which the object unit moves; the moving target cluster indicates which clusters will move in the frame at a given moment according to the object behavior and which will remain at rest; the moving target position is the end position of the object unit's movement; the moving speed is the speed of the object unit while moving; and the moving pattern is the form of the movement, such as a straight line, a spiral, or a polyline.
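Taking just the target position, speed, and (linear) pattern from the fields above, a movement step might look like the following. The stepwise linear interpolation is an illustrative stand-in for the engine's real movement logic.

```python
# Sketch of the movement fields of an imparted object behavior: moving
# target position, moving speed, and a linear moving pattern. The 1-D
# stepper below is an illustrative simplification.

def step_positions(start, target, speed, steps):
    """Move from start toward target at `speed` per step (linear pattern)."""
    positions = [start]
    pos = start
    for _ in range(steps):
        if abs(target - pos) <= speed:
            pos = target            # arrived at the moving target position
        else:
            pos += speed if target > pos else -speed
        positions.append(pos)
    return positions

path = step_positions(start=0.0, target=5.0, speed=2.0, steps=4)
```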
In the above embodiment, the track model for controlling clusters added by the timeline editor may also be regarded as a track model set in the control unit; by recording cluster creation, cluster movement, and cluster behavior, the animation designer can control and grasp the animation.
In a further alternative embodiment, the timeline editor further comprises a reorganization unit. Considering that in some periods the behavior similarity on which some object units were grouped no longer holds, i.e. the object units in a cluster are no longer consistent or similar and should be assigned to other clusters, the reorganization unit monitors the object behavior of all or some of the object units in any cluster. When the object behavior of some object units diverges, the reorganization unit issues a command to deconstruct the cluster, separate those object units from it, and gather object units with the same behavior similarity into a new cluster. It will be appreciated that deconstructing a cluster does not mean completely disbanding it: some object units may be extracted from the cluster and placed into other clusters, or object units may be extracted from one or more other clusters and, after a new cluster is created, placed into the new cluster, which is then issued commands to execute the corresponding object behavior.
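The reorganization step reduces to a partition of the cluster by observed behavior. This is a sketch only; the real unit would compare against the cluster's behavior-similarity criterion, not a single string.

```python
# Illustrative sketch of the reorganization unit: units whose observed
# behavior diverges from the cluster's imparted behavior are split off
# into a new cluster. Names and the string comparison are simplifications.

def reorganize(cluster, imparted_behavior, observed):
    """observed maps unit name -> its actual current behavior."""
    staying, leaving = [], []
    for unit in cluster:
        (staying if observed[unit] == imparted_behavior else leaving).append(unit)
    return staying, leaving        # `leaving` forms the new cluster

cluster = ["bird_1", "bird_2", "bird_3"]
observed = {"bird_1": "fly", "bird_2": "perch", "bird_3": "fly"}
staying, new_cluster = reorganize(cluster, "fly", observed)
```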
For object units with more distinctive behaviors, a separate track can be set, and a specific object behavior can be given to that unit individually. Specifically, the classification unit divides all object units into protagonist units and supporting-role units according to their importance in the animation, or simply extracts the protagonist units from among the object units; after the protagonist units are determined, the classification unit separates them from the clusters of supporting-role units into a cluster (or independent units) of their own. All behaviors of a protagonist unit are then controlled in the original timeline-editing manner, i.e. an independent track is used to configure protagonist behavior manifestations and protagonist behavior instructions for it. The object units in the other clusters, i.e. the supporting-role units, only accompany and set off the protagonists; because their behavior patterns are relatively uniform, object behavior can still be configured for them collectively in the manner described above.
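The protagonist/supporting-role split can be sketched as partitioning units by an importance score, with protagonists pulled out for individual tracks. A minimal sketch; the threshold value and all names (`split_roles`, `importance`) are assumptions for illustration:

```python
def split_roles(units, importance, threshold=0.8):
    """Partition object units into protagonist units (each gets an
    independent track) and supporting-role units (controlled per cluster)."""
    protagonists = [u for u in units if importance[u] >= threshold]
    supporting = [u for u in units if importance[u] < threshold]
    return protagonists, supporting

units = ["hero", "bird1", "bird2"]
importance = {"hero": 0.95, "bird1": 0.2, "bird2": 0.3}
protagonists, supporting = split_roles(units, importance)
```

The supporting-role list then feeds the normal clustering path, while each protagonist is assigned its own timeline track.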
Using this timeline editor, animation designers can concentrate on animating and rendering the protagonists. For the numerous supporting roles, designers only need to distribute them into specific clusters, set the autonomous behavior instructions of each cluster, and set the corresponding commands on the timeline; the supporting roles then behave according to those commands. By combining the control unit and the action giving unit, the supporting roles can also exhibit rich behavior, so the animation scene does not look unnatural. Moreover, animation designers no longer need to spend excessive time on supporting roles and can complete large-scale cluster animation with simple operations. This saves the user a great deal of time and greatly improves animation production efficiency.
Example 1
By replacing manually edited behavior with the control unit, each object unit in a cluster can be driven to make a behavior conforming to the command according to the characteristics of the cluster. The control unit takes effect only when no cluster command is received, which can be understood as its default nature. Cluster commands allow animation designers to issue commands to many object units in batches, and the control unit drives the object units to make specific behaviors, so that designers can control characters very conveniently without issuing specific instructions, while the characters still make appropriate behaviors. For example, an animation designer may configure a cluster as a flock of birds and configure the control unit's autonomous behavior instruction as flying around randomly. On the timeline, the designer may instruct this cluster to fly to a fixed point; each bird in the cluster then flies to that point. Once they have all gathered around the fixed point, the object behavior is completed. At that moment, the autonomous behavior instruction configured in the control unit takes effect: the birds fly around randomly, exhibiting a leisurely state.
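The bird-flock example can be sketched as a fallback rule: a unit executes the timeline's cluster command while one is pending, and reverts to the cluster's autonomous behavior instruction once the command completes. A minimal sketch under that assumption; the class and method names are hypothetical:

```python
class ObjectUnit:
    """A unit that follows cluster commands, falling back to the
    control unit's autonomous behavior when no command is pending."""
    def __init__(self, autonomous_behavior):
        self.autonomous_behavior = autonomous_behavior  # e.g. "fly around randomly"
        self.pending_command = None

    def receive_command(self, command):
        self.pending_command = command

    def tick(self):
        # a cluster command takes priority; the autonomous behavior is the default
        if self.pending_command is not None:
            behavior = self.pending_command
            self.pending_command = None  # command completes this tick (simplified)
            return behavior
        return self.autonomous_behavior

bird = ObjectUnit(autonomous_behavior="fly around randomly")
bird.receive_command("fly to fixed point")
first = bird.tick()   # the timeline command executes first
second = bird.tick()  # then the autonomous behavior resumes
```

The key design point is that the fallback is per-tick: the designer never has to script the "idle" state, because the control unit supplies it whenever the command queue is empty.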
Referring to fig. 6, the invention also discloses an animation production method, which comprises the following steps:
S100: relying on a timeline track of the animation, the classification unit of the timeline editor clusters the on-stage objects with the same behavior similarity into a first cluster, a second cluster, ..., an nth cluster according to the behavior similarity of the on-stage objects, where n ∈ Z+;
S200: the action giving unit of the timeline editor gives the object behavior to each cluster, so that the on-stage objects collected into the same cluster have the same behavior, and the object behavior given to different clusters is different;
S300: the control unit of the timeline editor is internally provided with spontaneous behavior instructions, and when the on-stage objects in any cluster have completed execution, the on-stage objects receive the spontaneous behavior instructions and execute them.
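The three steps can be strung together end to end: cluster by behavior similarity, give each cluster its own object behavior, then fall back to the autonomous behavior instruction once the behavior completes. A hypothetical sketch (all names and the flat timeline representation are assumptions, not the patented implementation):

```python
from collections import defaultdict

def animate(objects, similarity_key, cluster_behaviors, autonomous):
    # S100: cluster on-stage objects by behavior similarity
    clusters = defaultdict(list)
    for obj in objects:
        clusters[similarity_key(obj)].append(obj)

    timeline = []
    for name, members in clusters.items():
        # S200: give each cluster its own object behavior (differs per cluster)
        for obj in members:
            timeline.append((obj, cluster_behaviors[name]))
        # S300: after execution completes, the members receive and
        # execute the autonomous behavior instruction
        for obj in members:
            timeline.append((obj, autonomous))
    return timeline

objects = ["bird1", "bird2", "fish1"]
key = lambda o: "birds" if o.startswith("bird") else "fish"
tl = animate(objects, key,
             {"birds": "fly to point", "fish": "swim in circle"},
             autonomous="wander randomly")
```

Each object thus appears twice in the resulting timeline: once with its cluster's given behavior and once with the shared autonomous fallback.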
The invention also discloses a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, realizes the following steps: relying on a timeline track of the animation, the classification unit of the timeline editor clusters the on-stage objects with the same behavior similarity into a first cluster, a second cluster, ..., an nth cluster according to the behavior similarity of the on-stage objects, where n ∈ Z+; the action giving unit of the timeline editor gives the object behavior to each cluster, so that the on-stage objects collected into the same cluster have the same behavior, and the object behavior given to different clusters is different; the control unit of the timeline editor is internally provided with spontaneous behavior instructions, and when the on-stage objects in any cluster have completed execution, the on-stage objects receive the spontaneous behavior instructions and execute them.
It should be noted that the above embodiments of the present invention are preferred and are not limiting in any way. Any person skilled in the art may use the technical content disclosed above to change or modify it into equivalent effective embodiments without departing from the technical scope of the present invention; any modification or equivalent change made to the above embodiments according to the technical substance of the present invention still falls within the scope of the present invention.

Claims (10)

1. An animation system comprising an animation rendering engine and a timeline editor, the animation rendering engine forming behavior instructions to control on-stage objects within an animation, characterized in that,
the timeline editor relies on a timeline track of an animation and comprises a classification unit, an action giving unit and a control unit;
the classification unit clusters the on-stage objects with the same behavior similarity into a first cluster, a second cluster, ..., an nth cluster according to the behavior similarity of the on-stage objects, where n ∈ Z+;
the action giving unit gives the object behavior to each cluster, so that the on-stage objects collected into the same cluster have the same behavior, and the object behavior given to different clusters is different;
the control unit is internally provided with spontaneous behavior instructions, and when the on-stage objects in any cluster have completed execution, the on-stage objects receive the spontaneous behavior instructions and execute them.
2. The animation system of claim 1, wherein,
the classification unit defines each on-stage object as an object unit;
each object unit comprises a unit expression element and a unit control element, wherein the unit expression element renders appearance changes for the object unit, and the unit control element gives object behavior manifestations to the object unit;
the classification unit defines the behavior similarity of the on-stage objects according to similar appearance changes and/or behavior changes.
3. The animation system of claim 2, wherein,
each cluster comprises a cluster control element and a cluster command element;
the cluster control element automatically gives the object behavior to the unit control element, so that all object units in the same cluster display the same object behavior;
the cluster command element receives the spontaneous behavior instruction from the control unit and gives the spontaneous behavior instruction to the unit control element, so that part or all of the object units in the same cluster execute the autonomous behavior included in the spontaneous behavior instruction.
4. The animation system of claim 3, wherein,
each object unit is internally provided with a state machine, and the state machine represents the behavior state of the object unit;
the state machine sends the behavior state to the control unit, and the control unit sends a spontaneous behavior instruction to the cluster command element upon learning that the behavior state of the object unit meets a change trigger condition.
5. The animation system of claim 2, wherein,
when the object unit receives at least two object behaviors or spontaneous behavior instructions at the same time, the object unit executes the object behavior or spontaneous behavior instruction with the highest priority according to the priorities of the object behaviors and spontaneous behavior instructions.
6. The animation system of claim 1, wherein,
when the classification unit gathers the on-stage objects into clusters, any one or more of a cluster unit model, a cluster arrangement model, a number of cluster units, and a cluster behavior instruction is set for each cluster;
when the action giving unit gives the object behavior to each cluster, the object behavior includes any one or more of a moving target object, a moving target cluster, a moving target position, a moving speed, or a moving mode.
7. The animation system of claim 2, wherein,
the timeline editor also includes a reorganization unit;
the reorganization unit monitors the object behavior of all or some of the object units in any cluster, deconstructs the cluster when the object behavior of some object units differs, and clusters the object units with the same behavior similarity into a new cluster.
8. The animation system of claim 2, wherein,
the classification unit divides the object units into protagonist units and supporting-role units according to the importance degree of the object units, and separates the protagonist units from the clusters into which the supporting-role units are collected;
the action giving unit and the control unit give the protagonist units individual protagonist behavior manifestations and protagonist behavior instructions.
9. A method of animation comprising the steps of:
the timeline editor relies on a timeline track of an animation, and its classification unit clusters the on-stage objects with the same behavior similarity into a first cluster, a second cluster, ..., an nth cluster according to the behavior similarity of the on-stage objects, where n ∈ Z+;
the action giving unit of the timeline editor gives the object behavior to each cluster, so that the on-stage objects collected into the same cluster have the same behavior, and the object behavior given to different clusters is different;
the control unit of the timeline editor is internally provided with spontaneous behavior instructions, and when the on-stage objects in any cluster have completed execution, the on-stage objects receive the spontaneous behavior instructions and execute them.
10. A computer readable storage medium having stored thereon a computer program, characterized in that the computer program when executed by a processor realizes the steps of:
the timeline editor relies on a timeline track of an animation, and its classification unit clusters the on-stage objects with the same behavior similarity into a first cluster, a second cluster, ..., an nth cluster according to the behavior similarity of the on-stage objects, where n ∈ Z+;
the action giving unit of the timeline editor gives the object behavior to each cluster, so that the on-stage objects collected into the same cluster have the same behavior, and the object behavior given to different clusters is different;
the control unit of the timeline editor is internally provided with spontaneous behavior instructions, and when the on-stage objects in any cluster have completed execution, the on-stage objects receive the spontaneous behavior instructions and execute them.
CN201911346616.3A 2019-12-24 2019-12-24 Animation system, method and computer-readable storage medium Active CN111105482B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911346616.3A CN111105482B (en) 2019-12-24 2019-12-24 Animation system, method and computer-readable storage medium


Publications (2)

Publication Number Publication Date
CN111105482A CN111105482A (en) 2020-05-05
CN111105482B true CN111105482B (en) 2023-04-25

Family

ID=70423543

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911346616.3A Active CN111105482B (en) 2019-12-24 2019-12-24 Animation system, method and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN111105482B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101743542A (en) * 2007-06-29 2010-06-16 微软公司 Collecting and presenting temporal-based action information

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6686918B1 (en) * 1997-08-01 2004-02-03 Avid Technology, Inc. Method and system for editing or modifying 3D animations in a non-linear editing environment
US20050231512A1 (en) * 2004-04-16 2005-10-20 Niles Gregory E Animation of an object using behaviors
US20130127877A1 (en) * 2011-02-28 2013-05-23 Joaquin Cruz Blas, JR. Parameterizing Animation Timelines
US20130132840A1 (en) * 2011-02-28 2013-05-23 Joaquin Cruz Blas, JR. Declarative Animation Timelines
US8982132B2 (en) * 2011-02-28 2015-03-17 Adobe Systems Incorporated Value templates in animation timelines
US9773336B2 (en) * 2011-06-03 2017-09-26 Adobe Systems Incorporated Controlling the structure of animated documents

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101743542A (en) * 2007-06-29 2010-06-16 微软公司 Collecting and presenting temporal-based action information

Also Published As

Publication number Publication date
CN111105482A (en) 2020-05-05

Similar Documents

Publication Publication Date Title
US11276216B2 (en) Virtual animal character generation from image or video data
JP7403452B2 (en) interactive video game system
Yannakakis Game AI revisited
Bobick et al. The KidsRoom: A perceptually-based interactive and immersive story environment
CN106251389B (en) Method and device for producing animation
Shum et al. Interaction patches for multi-character animation
Hoffman et al. A hybrid control system for puppeteering a live robotic stage actor
US8648863B1 (en) Methods and apparatus for performance style extraction for quality control of animation
CN105447896A (en) Animation creation system for young children
US6756984B1 (en) Object displaying method, a recording medium and game apparatus
CN104867176A (en) Cryengine-based interactive virtual deduction system
CN1949274A (en) 3-D visualising method for virtual crowd motion
CN106683501A (en) AR children scene play projection teaching method and system
US11119509B2 (en) Configuring a color bi-directional pixel-based display screen with stereo sound for light shows using quadcopters
CN111324334B (en) Design method for developing virtual reality experience system based on narrative oil painting works
WO2022048333A1 (en) Method, apparatus, and device for dynamic change of virtual object, and storage medium
CN109345614A (en) The animation simulation method of AR augmented reality large-size screen monitors interaction based on deeply study
CN104504090B (en) The treating method and apparatus of image in a kind of webpage
CN111105482B (en) Animation system, method and computer-readable storage medium
CN105833462A (en) Method and device for controlling treadmill
US20230390653A1 (en) Smoothing server for processing user interactions to control an interactive asset
CN206363575U (en) A kind of paragliding VR simulators
CN108257220A (en) A kind of method of real character as virtual image implantation virtual world
CN113009848B (en) Intelligent music fountain distributed control system
CN215117126U (en) Intelligent projection equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant