CN115035218A - Interactive animation production method and device, computer equipment and storage medium - Google Patents

Interactive animation production method and device, computer equipment and storage medium

Info

Publication number
CN115035218A
CN115035218A
Authority
CN
China
Prior art keywords
data
interactive
real
frame
animation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210961861.0A
Other languages
Chinese (zh)
Other versions
CN115035218B (en)
Inventor
钟正阳
孟文亮
田柯
易勇军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Xiangsheng Network Information Co ltd
Original Assignee
Hunan Xiangsheng Network Information Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Xiangsheng Network Information Co., Ltd.
Priority to CN202210961861.0A
Publication of CN115035218A
Application granted
Publication of CN115035218B
Legal status: Active



Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/6009 Methods for processing data by generating or executing the game program for importing or creating game content, e.g. authoring tools during game development, adapting content to different platforms, use of a scripting language to create content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/04 Indexing scheme for image data processing or generation, in general involving 3D image data

Abstract

The application relates to an interactive animation production method and device, computer equipment, and a storage medium. The method comprises the following steps: acquiring preset readable and writable data of a real-time non-interactive animation; recording the readable and writable data into frames of a game engine, where each preset custom class records the readable and writable data of its bound object to obtain recorded data, and the frame data of the current frame is obtained from the recorded data of the plurality of custom classes in that frame; playing back the real-time non-interactive animation according to its frame data; and, when the user enters an interactive state, receiving the user's interactive operation to obtain real-time data corresponding to the interactive operation in the current frame, and playing the animation in the interactive state according to the real-time data. By adopting the method, interactive animations that are simple to produce and support diverse interaction modes can be created.

Description

Interactive animation production method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of data processing technologies, and in particular, to an interactive animation production method and device, a computer device, and a storage medium.
Background
With the development of data processing technology, audiences are no longer satisfied with pure video playback; videos with interactive links are popular with audiences. Interactive videos are products that take a plot as the core and allow the content to be influenced through some interactive mode.
Real-time animation is the most flexible form of animation: because each image is drawn while the animation runs, the content of the next image can be changed dynamically as needed, so real-time animation offers the most interactivity. Interactive animation is one kind of interactive video. Traditional interactive videos are limited to choosing the development route of the plot, so the interaction mode is single and easily becomes boring; interactive games with diverse interaction modes, on the other hand, require a dedicated game engine, with a development workload equivalent to traditional game development, high production difficulty, and long production cycles. Producing interactive animations that are both simple to make and support diverse interaction modes is therefore an urgent problem to solve.
Disclosure of Invention
In view of the foregoing, it is desirable to provide an interactive animation production method, apparatus, computer device, and storage medium.
An interactive animation production method, the method comprising:
acquiring preset readable and writable data of a real-time non-interactive animation;
recording the readable and writable data into frames of a game engine, recording, with each preset custom class, the readable and writable data of the bound object corresponding to that custom class to obtain recorded data, and obtaining frame data of the current frame from the recorded data corresponding to the plurality of custom classes in the current frame;
playing back the real-time non-interactive animation according to the frame data of the real-time non-interactive animation;
and, when the user enters an interactive state, receiving the user's interactive operation to obtain real-time data corresponding to the interactive operation in the current frame, and playing the animation in the interactive state according to the real-time data.
In one embodiment, the method further comprises: switching to the real-time non-interactive animation when the user exits the interactive state.
In one embodiment, the method further comprises the following steps: recording the readable and writable data of the real-time non-interactive animation into frames of the game engine by using a preset script, the script being used to declare the recorded data type; recording, according to each custom class preset by the user, the readable and writable data of the bound object corresponding to that custom class to obtain recorded data, the plurality of custom classes inheriting a parent class used to bind scene objects; and collecting, with a collector, the recorded data of the plurality of custom classes in the current frame to obtain the frame data.
In one embodiment, the method further comprises the following steps: serializing the frame data of the current frame and storing the serialized frame data as a local file, obtaining text data corresponding to the frame data; and loading the local file and deserializing it to obtain the frame data of the current frame.
In one embodiment, the method further comprises the following steps: distributing the frame data of the current frame to each custom class by using a distributor, each custom class setting the behavior of its bound object according to that object's data, so as to play the real-time non-interactive animation.
In one embodiment, the method further comprises the following steps: discarding, in the custom class corresponding to the bound object of the current frame's real-time data, the distributed data of the current frame, and setting the behavior of the bound object according to the real-time data corresponding to that object, so as to play the animation in the interactive state.
In one embodiment, the method further comprises the following steps: when the user exits the interactive state, interpolating between the readable and writable data in the real-time data and in the frame data of the current frame by a preset smooth-transition method, so as to transition smoothly back to the real-time non-interactive animation.
An interactive animation production device, the device comprising:
a data acquisition module, configured to acquire preset readable and writable data of a real-time non-interactive animation;
a data recording module, configured to record the readable and writable data into frames of a game engine, record, with each preset custom class, the readable and writable data of the bound object corresponding to that custom class to obtain recorded data, and obtain frame data of the current frame from the recorded data corresponding to the plurality of custom classes in the current frame;
an animation playing module, configured to play back the real-time non-interactive animation according to the frame data of the real-time non-interactive animation;
and an interaction module, configured to receive the user's interactive operation when the user enters an interactive state, obtain real-time data corresponding to the interactive operation in the current frame, and play the animation in the interactive state according to the real-time data.
A computer device comprising a memory storing a computer program and a processor that, when executing the computer program, implements the following steps:
acquiring preset readable and writable data of a real-time non-interactive animation;
recording the readable and writable data into frames of a game engine, recording, with each preset custom class, the readable and writable data of the bound object corresponding to that custom class to obtain recorded data, and obtaining frame data of the current frame from the recorded data corresponding to the plurality of custom classes in the current frame;
playing back the real-time non-interactive animation according to the frame data of the real-time non-interactive animation;
and, when the user enters an interactive state, receiving the user's interactive operation to obtain real-time data corresponding to the interactive operation in the current frame, and playing the animation in the interactive state according to the real-time data.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring preset readable and writable data of a real-time non-interactive animation;
recording the readable and writable data into frames of a game engine, recording, with each preset custom class, the readable and writable data of the bound object corresponding to that custom class to obtain recorded data, and obtaining frame data of the current frame from the recorded data corresponding to the plurality of custom classes in the current frame;
playing back the real-time non-interactive animation according to the frame data of the real-time non-interactive animation;
and, when the user enters an interactive state, receiving the user's interactive operation to obtain real-time data corresponding to the interactive operation in the current frame, and playing the animation in the interactive state according to the real-time data.
According to the interactive animation production method, device, computer equipment, and storage medium described above, the preset readable and writable data of the real-time non-interactive animation are acquired, and the game engine records them to obtain the frame data of every frame of the animation; the animation is then played back according to the recorded frame data. During recording, the readable and writable data of each bound object are recorded through its custom class; during playback, the custom class sets the behavior of the bound object according to that data. When the user enters the interactive state, real-time data corresponding to the user's operation is obtained, and the custom class sets the behavior of the bound object in the interactive state according to the real-time data corresponding to that object, so the interaction is completed purely through data exchange. Because of the diversity of the objects bound during recording, interaction points can be set in diverse ways, which realizes diverse user interaction. The embodiment of the invention can thus convert a real-time non-interactive animation into an interactive animation, realizing interactive video production that is simple and supports diverse interaction modes.
Drawings
FIG. 1 is a flow diagram illustrating a method for interactive animation according to one embodiment;
FIG. 2 is a schematic flow chart of processing frame data during video recording and video playing in one embodiment, in which (a) shows the processing of frame data during recording and (b) shows the processing of frame data during playing;
FIG. 3 is a schematic flow chart of video recording according to one embodiment;
FIG. 4 is a schematic flow chart illustrating video playback according to one embodiment;
FIG. 5 is a flow diagram illustrating interaction state switching in one embodiment;
FIG. 6 is a flow diagram illustrating the initiation of interaction in one embodiment;
FIG. 7 is a flowchart illustrating the end of interaction in one embodiment;
FIG. 8 is a block diagram showing the construction of an interactive animation apparatus according to an embodiment;
FIG. 9 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In one embodiment, as shown in FIG. 1, there is provided an interactive animation production method comprising the following steps:
and 102, acquiring the read-write data of the preset real-time non-interactive animation.
The real-time animation refers to animation which adopts various algorithms to realize motion control of moving objects, the non-interactive animation is a section of video which is simply played, the excellent playing rate usually depends on elaborate production, the economic cost is higher, and the interactive animation with the interactive link can increase the interest of exploring dramas of audiences, improve the experience feeling of the audiences and improve the playing rate of the video. The readable and writable data refer to data which can be read and written by a game engine such as a camera position, a lens angle, scene brightness and the like, in the embodiment of the invention, the game engine refers to Unity, and in addition, other game engines such as Unreal and the like can also realize the scheme of the invention. It should be understood that the conversion of real-time non-interactive animation into interactive animation is only one of the embodiments of the method of the present invention, and in addition to real-time animation, a general teaching video can be converted into an interactive teaching video capable of interacting with a user by using the method of the present invention.
Taking Unity internal light and shadow adjustment teaching as an example, when a teacher explains in a video, students can change the real-time change of light and shadow experience in the video by dragging a parameter sliding bar displayed in an interactive teaching video.
Step 104: recording the readable and writable data into frames of the game engine, recording, with each preset custom class, the readable and writable data of the bound object corresponding to that custom class to obtain recorded data, and obtaining frame data of the current frame from the recorded data corresponding to the plurality of custom classes in the current frame.
A class is the reflection in a computer of an entity in the real world or the world of thought; it packages data together with the operations on those data, and an object is a variable of a class type. A custom class is a class defined by the developer. The data that can be read and written in Unity, such as an animation track or an object's position and rotation angle, are recorded in Unity in advance.
Step 106: playing back the real-time non-interactive animation according to its frame data.
That is, the animation is played back according to the recorded readable and writable data.
Step 108: when the user enters the interactive state, receiving the user's interactive operation to obtain real-time data corresponding to the interactive operation in the current frame, and playing the animation in the interactive state according to the real-time data.
The user's interactive operation is the user's input signal, such as a mouse movement or a key press. InteractionBegin is called to bind the user's input signal to the corresponding data, yielding real-time data.
The custom class then sets the behavior of the bound object in the interactive state according to the real-time data corresponding to that object; in other words, in the interactive state the recorded data is replaced with the real-time data, and the interactive effect is generated from the real-time data.
According to the interactive animation production method, the preset readable and writable data of the real-time non-interactive animation are acquired, and the game engine records them to obtain the frame data of every frame of the animation; the animation is then played back according to the recorded frame data. During recording, the readable and writable data of each bound object are recorded through its custom class; during playback, the custom class sets the behavior of the bound object according to that data. When the user enters the interactive state, real-time data corresponding to the user's operation is obtained, and the custom class sets the behavior of the bound object in the interactive state according to the real-time data corresponding to that object, so the interaction is completed purely through data exchange. Because of the diversity of the objects bound during recording, interaction points can be set in diverse ways, which realizes diverse user interaction. The embodiment of the invention can thus convert a real-time non-interactive animation into an interactive animation, realizing interactive animation production that is simple and supports diverse interaction modes.
In an embodiment, as shown in FIG. 3, a schematic flow chart of video recording is provided. The step of recording the readable and writable data into frames of the game engine, recording the readable and writable data of each custom class's bound object to obtain recorded data, and obtaining the frame data of the current frame from the recorded data of the plurality of custom classes comprises: recording the readable and writable data of the real-time non-interactive animation into frames of the game engine by using a preset script, the script being used to declare the recorded data type; recording, according to each custom class preset by the user, the readable and writable data of the bound object corresponding to that custom class to obtain recorded data, the plurality of custom classes inheriting a parent class used to bind scene objects; and collecting, with a collector, the recorded data of the plurality of custom classes in the current frame to obtain the frame data.
In this embodiment, the script records the readable and writable data every frame. By declaring an attribute [RecordDataType] in the script, the developer makes the data be recorded automatically as frames in the Unity engine; [RecordDataType] is unique to the API (Application Programming Interface) of the method of the present invention, and the script language is C#.
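The declare-then-record-every-frame idea behind [RecordDataType] can be sketched as follows. This is a minimal Python analogue of the C# attribute mechanism (the patent's scripts are C# inside Unity); the names `record_data_type` and `snapshot` are hypothetical, not the patent's actual API.

```python
def record_data_type(*field_names):
    """Class decorator standing in for the [RecordDataType] attribute:
    it declares which fields are recorded automatically every frame."""
    def wrap(cls):
        cls._recorded_fields = list(field_names)
        return cls
    return wrap

@record_data_type("x", "y", "z")
class CameraState:
    """Example recordable object: a camera position."""
    def __init__(self, x=0.0, y=0.0, z=0.0):
        self.x, self.y, self.z = x, y, z

    def snapshot(self):
        # Read only the declared readable/writable fields for this frame
        return {name: getattr(self, name) for name in self._recorded_fields}

cam = CameraState(x=1.0, y=1.0, z=2.0)
print(cam.snapshot())  # {'x': 1.0, 'y': 1.0, 'z': 2.0}
```

A per-frame loop would call `snapshot()` once per engine frame and append the result to the frame's record.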
The developer's custom class inherits a parent class [RecordableObject]; [RecordableObject] is the class name of the parent class and is unique to the API of the method. It inherits MonoBehaviour and is used to bind a scene object and interact with it; interacting with the scene object includes acquiring the bound object's data and applying data to the scene object.
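The parent-class pattern can be illustrated like this. It is a Python sketch of the C#/Unity design described above; the hook names `record` and `apply` are assumptions standing in for the two interactions the text names (acquiring the bound object's data and applying data to it).

```python
class RecordableObject:
    """Sketch of the [RecordableObject] parent class: holds a bound scene
    object and defines the two interactions with it."""
    def __init__(self, bound_object):
        self.bound_object = bound_object

    def record(self):
        """Acquire the bound object's data (recording path)."""
        raise NotImplementedError

    def apply(self, data):
        """Apply data to the bound scene object (playback path)."""
        raise NotImplementedError

class ModelRecorder(RecordableObject):
    """Developer-defined custom class bound to a 3D model's position."""
    def record(self):
        return dict(self.bound_object)

    def apply(self, data):
        self.bound_object.update(data)

model = {"x": 1.0, "y": 1.0, "z": 2.0}
rec = ModelRecorder(model)
rec.apply({"x": 1.1, "y": 1.0, "z": 2.0})  # playback moves the model
print(model)  # {'x': 1.1, 'y': 1.0, 'z': 2.0}
```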
A custom class bound into a scene usually binds things in the scene, such as models of people and articles, cameras, and lights, and collects the bound object's data (such as its position) every frame as the program describes. Unity typically sets the frame time interval to 0.02 seconds. The default collector provided in the scene collects all such objects and packs all data of the current frame into frame data. In FIG. 3, the custom classes include a model class, a camera class, a light class, and other classes; correspondingly, the readable and writable data cover the 3D model, camera, lights, and other objects.
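The collector step, gathering every custom class's recorded data into one frame record, might look like this minimal sketch (`collect_frame` and `PositionRecorder` are hypothetical names, and the 0.02 s interval echoes the Unity default mentioned above):

```python
FRAME_INTERVAL = 0.02  # Unity's typical fixed time step, per the description

class PositionRecorder:
    """Minimal stand-in for a custom class bound to one scene object."""
    def __init__(self, bound_object):
        self.bound_object = bound_object

    def record(self):
        return dict(self.bound_object)

def collect_frame(recorders):
    """Default collector: pack all recorders' data of the current frame
    into one frame-data record, keyed by object name."""
    return {name: r.record() for name, r in recorders.items()}

recorders = {
    "model":  PositionRecorder({"x": 1.0, "y": 1.0, "z": 2.0}),
    "camera": PositionRecorder({"x": 0.0, "y": 5.0, "z": -10.0}),
}
frame = collect_frame(recorders)
print(frame["model"])  # {'x': 1.0, 'y': 1.0, 'z': 2.0}
```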
In one embodiment, as shown in FIG. 2, a schematic flow chart of processing frame data during video recording and video playing is provided, in which (a) shows the processing of frame data during recording and (b) shows the processing during playing. Before playing back the real-time non-interactive animation according to the frame data of the current frame, the method comprises: serializing the frame data of the current frame and storing the serialized frame data as a local file, obtaining text data corresponding to the frame data; and loading the local file and deserializing it to obtain the frame data of the current frame. In this embodiment, the format used for serialization and deserialization is the Json format, and in FIG. 2 the recorded data is the readable and writable data recorded by the custom classes.
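The Json round trip described above can be sketched with a standard JSON library. The patent writes the text to a local file; to stay self-contained this sketch keeps the serialized text in memory, and the frame values are illustrative.

```python
import json

# Frame-data packets for two consecutive frames (illustrative values)
frames = [
    {"model": {"x": 1.0, "y": 1.0, "z": 2.0}},
    {"model": {"x": 1.1, "y": 1.0, "z": 2.0}},
]

text = json.dumps(frames)     # serialize: frame data -> Json text
restored = json.loads(text)   # deserialize: Json text -> frame data
assert restored == frames     # the round trip preserves every frame
print(restored[1]["model"]["x"])  # 1.1
```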
In one embodiment, as shown in FIG. 4, a schematic flow chart of video playing is provided. The step of playing back the real-time non-interactive animation according to the frame data of the current frame comprises: distributing the frame data of the current frame to each custom class by using a distributor, each custom class setting the behavior of its bound object according to that object's data, so as to play the real-time non-interactive animation.
In this embodiment, the local file is loaded and deserialized into frame-data packets in memory; the first frame takes frame 1 of the packet, the second frame takes frame 2, and so on, so each frame obtains its corresponding frame data. After acquiring the frame data of the current frame, the distributor distributes it to each custom class. Each custom class then sets the behavior of its bound object according to the acquired data: for example, if a model class receives {x=1, y=1, z=2}, the bound model's position is set to (1, 1, 2); if the next frame sets the position to (1.1, 1, 2), the model's movement is thereby completed.
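The distributor loop this paragraph describes can be sketched as follows; `distribute` and `apply` are assumed names, and the position example matches the one in the text.

```python
class ModelClass:
    """Stand-in custom class: applying data sets the bound model's position."""
    def __init__(self):
        self.position = None

    def apply(self, data):
        self.position = (data["x"], data["y"], data["z"])

def distribute(frame_data, custom_classes):
    """Distributor: hand each custom class the data of its bound object."""
    for name, data in frame_data.items():
        custom_classes[name].apply(data)

classes = {"model": ModelClass()}
distribute({"model": {"x": 1, "y": 1, "z": 2}}, classes)
print(classes["model"].position)  # (1, 1, 2)
```

Calling `distribute` once per frame with the next frame's packet replays the recorded motion.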
In one embodiment, as shown in FIG. 6, a schematic flow chart of starting the interaction is provided. Playing the animation in the interactive state according to the real-time data comprises: discarding, in the custom class corresponding to the bound object of the current frame's real-time data, the distributed data of the current frame, and setting the behavior of the bound object according to the real-time data corresponding to that object, so as to play the animation in the interactive state.
In this embodiment, a script written by the developer inherits the parent class; after receiving the user's input signal, it calls InteractionBegin to bind the input signal to the corresponding data and obtain real-time data. Changing recorded data into real-time data through the input signal is implemented in OnPlay(), an overridden function.
When the user enters the interactive state, the custom class discards the currently distributed recorded data and uses the real-time data generated by the user's interaction instead. For example, if the recorded data of the 3D model in the previous frame is {x=1, y=1, z=2} and in the current frame is {x=1.1, y=1, z=2}, the model should move forward; but if the real-time data is {x=0.9, y=1, z=2}, then in the interactive state the real-time data makes the model move backward. This achieves the effect of the user influencing the video content through interaction and enhances the interactive effect.
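The switch from recorded to real-time data can be expressed as a small selection function, sketched here with the position example above (`resolve_frame` is an illustrative name, not the patent's API):

```python
def resolve_frame(recorded, realtime, interacting):
    """In the interactive state, discard the distributed recorded data and
    use the real-time data generated by user input instead; otherwise keep
    the recorded data for normal playback."""
    if interacting and realtime is not None:
        return realtime
    return recorded

# Recorded data would move the model forward; real-time input moves it back
recorded = {"x": 1.1, "y": 1.0, "z": 2.0}
realtime = {"x": 0.9, "y": 1.0, "z": 2.0}
print(resolve_frame(recorded, realtime, interacting=True))   # real-time wins
print(resolve_frame(recorded, realtime, interacting=False))  # recorded wins
```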
In one embodiment, as shown in FIG. 7, a schematic flow chart of the end of the interaction is provided: when the interaction ends, playback of the real-time non-interactive animation resumes, and the switch from the interactive state back to the non-interactive state is made either by smooth transition or by direct switching.
In an embodiment, as shown in FIG. 5, a schematic flow chart of the interactive state switching is provided, and the method further comprises: switching to the real-time non-interactive animation when the user exits the interactive state. The step of switching to the real-time non-interactive animation when the user exits the interactive state comprises: interpolating between the readable and writable data in the real-time data and in the frame data of the current frame by a preset smooth-transition method, so as to transition smoothly back to the real-time non-interactive animation. In this embodiment, the smooth-transition method avoids an abrupt playing effect when the video state switches, so that the user obtains an immersive experience.
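The interpolation on exit can be sketched as a per-field linear blend. Linear interpolation is one plausible choice; the patent only says "a preset smooth transition method", so `lerp` and `smooth_exit` are assumptions for illustration.

```python
def lerp(a, b, t):
    """Linear interpolation from a to b, with t in [0, 1]."""
    return a + (b - a) * t

def smooth_exit(realtime, recorded, t):
    """Blend each readable/writable field from the last real-time value
    back toward the recorded frame value as t runs from 0 to 1."""
    return {k: lerp(realtime[k], recorded[k], t) for k in recorded}

# Halfway through the transition the model sits between both positions
print(smooth_exit({"x": 1.0}, {"x": 2.0}, 0.5))  # {'x': 1.5}
```

Running `t` from 0 to 1 over a few frames hands control back to the recorded animation without a visible jump.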
In one embodiment, the method further comprises: setting an extension script. In this embodiment, the invention records data while the real-time animation runs and, during playback, chooses whether to play back the recorded data or the interactive real-time data depending on whether a user operation is present, so the user can interactively change the playing state or content of the animation. However, the interactive content each developer wants to realize differs, and no single method can cover arbitrary data; the extension script is therefore provided so that developers can freely define interactive content.
In one embodiment, the method further comprises: setting a plug-in panel according to the standard plug-in format of the game engine. In this embodiment, packaging as a plug-in ensures portability, so it can be quickly imported into any project. The plug-in panel makes it convenient for developers to install the plug-in, set data, and so on, further simplifying operation. Here the project is a Unity project; a Unity plug-in is the format provided by the Unity platform and can be imported into any Unity project.
Besides animation, the method can also be applied to teaching videos made with a game engine or a web page; it only requires completing the plug-in on that game engine or web platform.
It should be understood that, although the steps in the flowchart of FIG. 1 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the steps need not be performed in a strict order and may be performed in other orders. Moreover, at least some of the steps in FIG. 1 may include multiple sub-steps or stages that are not necessarily performed at the same moment but may be performed at different moments, and these sub-steps or stages are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in FIG. 8, there is provided an interactive animation device, comprising: a data acquisition module 802, a data recording module 804, an animation playing module 806, and an interaction module 808, wherein:
the data acquisition module 802 is configured to acquire the preset readable and writable data of the real-time non-interactive animation;
the data recording module 804 is configured to record the readable and writable data into frames of the game engine, record, with each preset custom class, the readable and writable data of the bound object corresponding to that custom class to obtain recorded data, and obtain frame data of the current frame from the recorded data corresponding to the plurality of custom classes in the current frame;
the animation playing module 806 is configured to play back the real-time non-interactive animation according to the frame data of the real-time non-interactive animation;
and the interaction module 808 is configured to receive the user's interactive operation when the user enters the interactive state, obtain real-time data corresponding to the interactive operation in the current frame, and play the animation in the interactive state according to the real-time data.
In one embodiment, the method is further used for switching to the real-time non-interactive animation when the user exits the interactive state.
In one embodiment, the data recording module 804 is further configured to record readable and writable data of the real-time non-interactive animation into a frame of the game engine by using a preset script; the script is used for declaring the record data type; recording readable write data belonging to a binding object corresponding to a custom class according to the custom class preset by a user to obtain recorded data; a plurality of custom classes inherit parent classes; the parent class is used for binding the scene object; and collecting a plurality of self-defining classes of recorded corresponding recorded data in the current frame by using a collector to obtain frame data.
In one embodiment, the device is further configured to serialize the frame data of the current frame and store the serialized frame data as a local file, so as to obtain text data corresponding to the frame data; and to load the local file and deserialize it to obtain the frame data of the current frame.
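The serialize-to-local-file and deserialize-on-load round trip described above can be illustrated with JSON as a stand-in text format; the patent specifies only "text data" and a "local file", so the format and file name here are assumptions.

```python
import json
import os
import tempfile

# Recorded frame data: a list of frames, each a list of per-object records.
frames = [
    [{"pos": [0.0, 0.0, 0.0], "rot": 0.0}],
    [{"pos": [0.1, 0.0, 0.0], "rot": 5.0}],
]

path = os.path.join(tempfile.gettempdir(), "recording.json")

# Serialize: frame data -> text data stored as a local file.
with open(path, "w") as f:
    json.dump(frames, f)

# Deserialize: load the local file back into frame data for playback.
with open(path) as f:
    loaded = json.load(f)

assert loaded == frames  # the round trip preserves the frame data
```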
In one embodiment, the animation playing module 806 is further configured to distribute the frame data of the current frame to each custom class by using a distributor, and each custom class sets the behavior of its bound object according to the data of that object, so as to play the real-time non-interactive animation.
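Playback is the mirror image of recording: a distributor hands each custom class its slice of the frame data, and the class applies it to its bound object. A minimal sketch, with `Distributor` and `TransformPlayer` as assumed names:

```python
class TransformPlayer:
    """Custom class for playback: applies recorded data to its bound object."""
    def __init__(self, scene_object):
        self.bound = scene_object

    def apply(self, record):
        # Set the bound object's behavior from the recorded data.
        self.bound["pos"] = list(record["pos"])
        self.bound["rot"] = record["rot"]


class Distributor:
    """Distributes one frame's data to the custom classes, record by record."""
    def __init__(self, players):
        self.players = players

    def distribute(self, frame_data):
        for player, record in zip(self.players, frame_data):
            player.apply(record)


ball = {"pos": [0.0, 0.0, 0.0], "rot": 0.0}
Distributor([TransformPlayer(ball)]).distribute(
    [{"pos": (1.0, 2.0, 0.0), "rot": 90.0}]
)
print(ball)  # {'pos': [1.0, 2.0, 0.0], 'rot': 90.0}
```

Called once per engine frame with successive frame data, this reproduces the recorded real-time non-interactive animation.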
In one embodiment, the interaction module 808 is further configured to discard the distributed data of the current frame for the custom class corresponding to the binding object of the real-time data of the current frame, and to set the behavior of the binding object according to the real-time data corresponding to the binding object, so as to play the animation in the interactive state.
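The override rule above — a bound object for which the user has produced real-time data drops its distributed frame data for that frame, while all other objects keep playing back — might be sketched like this. The index-based mapping from object to real-time data is an illustrative assumption.

```python
class Player:
    """Stands in for a custom class bound to one scene object."""
    def __init__(self, scene_object):
        self.bound = scene_object

    def apply(self, record):
        self.bound.update(record)


def play_frame(players, frame_data, realtime_data):
    """Play one frame. realtime_data maps a player's index to the live
    record produced by the user's interactive operation; for those players
    the distributed frame data is discarded and the real-time data drives
    the bound object's behavior instead."""
    for i, (player, record) in enumerate(zip(players, frame_data)):
        player.apply(realtime_data.get(i, record))


door, camera = {"angle": 0.0}, {"zoom": 1.0}
players = [Player(door), Player(camera)]
# The user is interacting with the door only; the camera keeps playing back.
play_frame(players,
           frame_data=[{"angle": 45.0}, {"zoom": 2.0}],
           realtime_data={0: {"angle": 90.0}})
print(door, camera)  # {'angle': 90.0} {'zoom': 2.0}
```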
In one embodiment, the interaction module 808 is further configured to, when the user exits the interactive state, interpolate between the readable and writable data in the real-time data and in the frame data of the current frame by using a preset smooth transition method, so as to transition smoothly back to the real-time non-interactive animation.
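The patent does not name a concrete smooth transition method; one common choice, assumed here for illustration, is linear interpolation from the last real-time value back to the recorded value over a fixed number of frames:

```python
def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t


def smooth_transition(realtime_value, recorded_value, frames=30):
    """Interpolate one readable/writable scalar from its last interactive
    value back to the recorded animation value over `frames` frames."""
    return [lerp(realtime_value, recorded_value, k / frames)
            for k in range(frames + 1)]


# The user released an object at angle 10.0; the recording expects 0.0.
steps = smooth_transition(10.0, 0.0, frames=4)
print(steps)  # [10.0, 7.5, 5.0, 2.5, 0.0]
```

Applying one interpolated value per frame avoids the visual pop that would occur if the recorded data were reasserted in a single frame.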
For specific limitations of the interactive animation device, reference may be made to the above limitations of the interactive animation method, which are not repeated here. The modules in the interactive animation device described above may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded, in hardware form, in or be independent of a processor of the computer device, or may be stored, in software form, in a memory of the computer device, so that the processor can invoke and execute the operations corresponding to the modules.
In one embodiment, a computer device is provided. The computer device may be a terminal, and its internal structure diagram may be as shown in FIG. 9. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by the processor, implements the interactive animation method. The display screen of the computer device may be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer device may be a touch layer covering the display screen, a key, a trackball, or a touch pad arranged on the housing of the computer device, or an external keyboard, touch pad, or mouse.
It will be appreciated by those skilled in the art that the structure shown in FIG. 9 is a block diagram of only part of the structure related to the solution of the present application and does not limit the computer device to which the solution of the present application is applied; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In an embodiment, a computer device is provided, comprising a memory storing a computer program and a processor implementing the steps of the method in the above embodiments when the processor executes the computer program.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, carries out the steps of the method in the above-mentioned embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing relevant hardware. The computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein can include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus DRAM (RDRAM), and direct Rambus DRAM (DRDRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, the combination should be considered to fall within the scope of this specification.
The above-mentioned embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that several variations and modifications can be made by those skilled in the art without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A method of interactive animation, the method comprising:
acquiring readable and writable data of a preset real-time non-interactive animation;
recording the readable and writable data into a frame of a game engine, recording the readable and writable data belonging to the binding object corresponding to each preset custom class by using that custom class to obtain recorded data, and obtaining frame data of a current frame according to the recorded data corresponding to a plurality of custom classes in the current frame;
playing back the real-time non-interactive animation according to the frame data of the real-time non-interactive animation;
and when the user enters an interactive state, receiving the user's interactive operation to obtain real-time data corresponding to the interactive operation in the current frame, and playing the animation in the interactive state according to the real-time data.
2. The method of claim 1, further comprising:
and when the user exits the interactive state, switching to the real-time non-interactive animation.
3. The method according to claim 1 or 2, wherein the step of recording the readable and writable data into a frame of a game engine, recording the readable and writable data belonging to the binding object corresponding to each custom class by using the preset custom class to obtain recorded data, and obtaining frame data of a current frame according to the recorded data corresponding to a plurality of custom classes in the current frame comprises:
recording the readable and writable data of the real-time non-interactive animation into a frame of a game engine by using a preset script, the script being used to declare the recorded data type;
recording the readable and writable data belonging to the binding object corresponding to each custom class preset by the user to obtain recorded data, wherein the plurality of custom classes inherit from a parent class, and the parent class is used for binding a scene object;
and collecting, by using a collector, the recorded data that the plurality of custom classes have recorded in the current frame, to obtain frame data.
4. The method of claim 3, wherein before playing back the real-time non-interactive animation according to the frame data of the current frame, the method comprises:
serializing the frame data of the current frame, and storing the serialized frame data as a local file to obtain text data corresponding to the frame data;
and loading the local file and deserializing it to obtain the frame data of the current frame.
5. The method of claim 4, wherein the step of switching to the real-time non-interactive animation according to the frame data of the current frame comprises:
distributing the frame data of the current frame to each custom class by using a distributor, wherein each custom class sets the behavior of its bound object according to the data of the bound object, so as to play the real-time non-interactive animation.
6. The method according to claim 1 or 5, wherein playing the animation in the interactive state according to the real-time data comprises:
discarding the distributed data of the current frame for the custom class corresponding to the binding object of the real-time data of the current frame, and setting the behavior of the binding object according to the real-time data corresponding to the binding object, so as to play the animation in the interactive state.
7. The method of claim 2, wherein the step of switching to the real-time non-interactive animation when the user exits the interactive state comprises:
when the user exits the interactive state, interpolating between the readable and writable data in the real-time data and in the frame data of the current frame by using a preset smooth transition method, so as to transition smoothly back to the real-time non-interactive animation.
8. An interactive animation device, characterized in that the device comprises:
the data acquisition module is used for acquiring readable and writable data of a preset real-time non-interactive animation;
the data recording module is used for recording the readable and writable data into a frame of a game engine, recording the readable and writable data belonging to the binding object corresponding to each preset custom class by using that custom class to obtain recorded data, and obtaining frame data of a current frame according to the recorded data corresponding to a plurality of custom classes in the current frame;
the animation playing module is used for playing back the real-time non-interactive animation according to the frame data of the real-time non-interactive animation;
and the interaction module is used for receiving, when the user enters an interactive state, the user's interactive operation, obtaining real-time data corresponding to the interactive operation in the current frame, and playing the animation in the interactive state according to the real-time data.
9. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN202210961861.0A 2022-08-11 2022-08-11 Interactive animation production method and device, computer equipment and storage medium Active CN115035218B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210961861.0A CN115035218B (en) 2022-08-11 2022-08-11 Interactive animation production method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN115035218A true CN115035218A (en) 2022-09-09
CN115035218B CN115035218B (en) 2022-11-01

Family

ID=83129890

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210961861.0A Active CN115035218B (en) 2022-08-11 2022-08-11 Interactive animation production method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115035218B (en)

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5983190A (en) * 1997-05-19 1999-11-09 Microsoft Corporation Client server animation system for managing interactive user interface characters
US20120021828A1 (en) * 2010-02-24 2012-01-26 Valve Corporation Graphical user interface for modification of animation data using preset animation samples
US20130332167A1 (en) * 2012-06-12 2013-12-12 Nuance Communications, Inc. Audio animation methods and apparatus
CN104077797A (en) * 2014-05-19 2014-10-01 无锡梵天信息技术股份有限公司 Three-dimensional game animation system
CN104715500A (en) * 2015-03-26 2015-06-17 金陵科技学院 3D animation production development system based on three-dimensional animation design
CN105354872A (en) * 2015-11-04 2016-02-24 深圳墨麟科技股份有限公司 Rendering engine, implementation method and producing tools for 3D web game
US20170278291A1 (en) * 2016-03-25 2017-09-28 Microsoft Technology Licensing, Llc Multi-Mode Animation System
CN107767433A (en) * 2017-09-30 2018-03-06 镇江景天信息科技有限公司 A kind of system of exploitation and composition 3D animations
CN108256062A (en) * 2018-01-16 2018-07-06 携程旅游信息技术(上海)有限公司 Web animation implementation method, device, electronic equipment, storage medium
CN109242938A (en) * 2018-09-12 2019-01-18 福建天晴数码有限公司 A kind of animation editing method and terminal based on Unity
CN109523613A (en) * 2018-11-08 2019-03-26 腾讯科技(深圳)有限公司 Data processing method, device, computer readable storage medium and computer equipment
CN109828791A (en) * 2018-12-28 2019-05-31 北京奇艺世纪科技有限公司 A kind of animation playing method, terminal and computer readable storage medium
CN109861948A (en) * 2017-11-30 2019-06-07 腾讯科技(成都)有限公司 Virtual reality data processing method, device, storage medium and computer equipment
US20190261062A1 (en) * 2018-02-22 2019-08-22 Microsoft Technology Licensing, Llc Low latency broadcasting of game engine frames
CN110858408A (en) * 2018-08-07 2020-03-03 奥多比公司 Animation production system
CN111145317A (en) * 2019-12-26 2020-05-12 北京像素软件科技股份有限公司 Game animation playing method and device and server
CN113546409A (en) * 2020-04-24 2021-10-26 福建天晴数码有限公司 Method for playing GIF format resource by Laya engine and storage medium
WO2022021686A1 (en) * 2020-07-28 2022-02-03 完美世界(北京)软件科技发展有限公司 Method and apparatus for controlling virtual object, and storage medium and electronic apparatus
CN114344918A (en) * 2021-11-02 2022-04-15 腾讯科技(深圳)有限公司 Data recording method, device, equipment, medium and product based on virtual scene


Also Published As

Publication number Publication date
CN115035218B (en) 2022-11-01

Similar Documents

Publication Publication Date Title
CN111294663B (en) Bullet screen processing method and device, electronic equipment and computer readable storage medium
US7336280B2 (en) Coordinating animations and media in computer display output
US20050204287A1 (en) Method and system for producing real-time interactive video and audio
US10417308B2 (en) Commenting dynamic content
US11620784B2 (en) Virtual scene display method and apparatus, and storage medium
CN107277412B (en) Video recording method and device, graphics processor and electronic equipment
US11941728B2 (en) Previewing method and apparatus for effect application, and device, and storage medium
CN110784753B (en) Interactive video playing method and device, storage medium and electronic equipment
CN110475140A (en) Barrage data processing method, device, computer readable storage medium and computer equipment
JP2015531097A (en) Animation reproduction method, apparatus, device, program, and recording medium
US20140282000A1 (en) Animated character conversation generator
CN109413352B (en) Video data processing method, device, equipment and storage medium
CN110362375A (en) Display methods, device, equipment and the storage medium of desktop data
CN115035218B (en) Interactive animation production method and device, computer equipment and storage medium
WO2018049682A1 (en) Virtual 3d scene production method and related device
CN111866403A (en) Video graphic content processing method, device, equipment and medium
CN113559503B (en) Video generation method, device and computer readable medium
CN116095388A (en) Video generation method, video playing method and related equipment
CN114449334A (en) Video recording method, video recording device, electronic equipment and computer storage medium
WO2022183866A1 (en) Method and apparatus for generating interactive video
CN117876545A (en) Animation generation method, device, electronic equipment and storage medium
CN116437153A (en) Previewing method and device of virtual model, electronic equipment and storage medium
CN117576281A (en) Grating animation production method and device
CN116647733A (en) Virtual model click event processing method and device, electronic equipment and storage medium
CN115988233A (en) Live broadcast entrance show display processing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant