CN108287718B - Special effect editing method and device based on game engine

Info

Publication number
CN108287718B
CN108287718B (application CN201710387683.4A)
Authority
CN
China
Prior art keywords
special effect, nodes, node, file, special
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710387683.4A
Other languages
Chinese (zh)
Other versions
CN108287718A (en)
Inventor
张士凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Idreamsky Technology Co., Ltd.
Original Assignee
Shenzhen Idreamsky Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Idreamsky Technology Co., Ltd.
Priority to CN201710387683.4A
Publication of CN108287718A
Application granted
Publication of CN108287718B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 - Arrangements for software engineering
    • G06F 8/40 - Transformation of program code
    • G06F 8/41 - Compilation
    • G06F 8/44 - Encoding
    • G06F 8/447 - Target code generation

Abstract

The invention discloses a special effect editing method based on a game engine. The method comprises the following steps: a special effect editor loads a project file and responds to it; the special effect nodes in the project file are acquired; updates made by the user to the special effect nodes are received; and the special effect of the nodes is displayed by previewing the project file. Because the editor loads and responds to the project file, acquires its special effect nodes, receives the user's updates to them, and finally displays the resulting special effect through a preview, it can show the effect of a special effect node intuitively and in real time. The editor thus gains a visual-operation capability, which improves the efficiency and reliability of special effect production.

Description

Special effect editing method and device based on game engine
Technical Field
The invention relates to the field of game special effect design, in particular to a visual special effect editing method and a computer readable storage medium.
Background
Game special effects are mainly developed with a game engine. A game engine is a game development tool; well-known engines include Cocos, Unreal Engine, Unity, Director, the Blender Game Engine, Virtools, and Torque Game Builder. Engines can be divided into those with a visual operation interface and those without one. Engines without a visual interface are typically open source, cross-platform, and highly open, with rich material and special effect libraries, so a game developer can freely develop, modify, or customize a special effect library that meets his own needs; Cocos2dx, a member of the Cocos engine family, is one example. Other engine platforms, such as Unity, do have a visual interface but are not open source, which makes it harder for game developers to develop, modify, or customize class libraries to their needs.
A special effect editor based on Cocos2dx is mainly built by software engineers through program coding. Typically, after completing a certain production step the engineer runs the program code, and only then may discover that some links are wrong. After a long series of development steps, if part of the code is wrong or the imported material data is incorrect, the expected special effect cannot be achieved, and the earlier code must be debugged. Especially when the code base is large, debugging costs a great deal of labor and time, which directly reduces working efficiency.
Because the code and the resulting special effect of a node cannot be displayed directly in a window, a software engineer must cooperate by writing program code, making the development of special effect nodes quite complex. And because all the links of development are intertwined, locating a problem generally requires checking several links; the work of each person responsible is hard to separate, and communication costs are high.
In short, with a purely coded special effect editing scheme, the special effect cannot be displayed intuitively and in time. This greatly raises the probability of problems during the development of special effect nodes, requires extra labor and time to solve them, and lowers the efficiency of special effect production.
Disclosure of Invention
The invention provides a special effect editing method based on a game engine, and a computer readable storage medium, aiming to solve the problem in the related art that the special effect of a node in a special effect editor cannot be displayed intuitively and in time.
A game engine based special effect editing method, the method comprising the steps of:
loading a project file by a special effect editor and responding to the project file;
acquiring the special effect nodes in the project file;
receiving updates made by a user to the special effect nodes;
and displaying the special effect of the special effect nodes by previewing the project file.
A game engine based special effect editing apparatus, the apparatus comprising:
a response module, used for loading the project file by the special effect editor and responding to it;
an acquisition module, used for acquiring the special effect nodes in the project file;
a receiving module, used for receiving the user's updates to the special effect nodes;
and a display module, used for displaying the special effect of the special effect nodes by previewing the project file.
The technical scheme provided by the embodiment of the invention can have the following beneficial effects:
according to the scheme, the special effect editor loads the engineering file and responds to the engineering file, then the special effect nodes in the engineering file are obtained, the updating of the special effect nodes by a user is received, and finally the special effect of the special effect nodes is displayed through previewing, so that the special effect editor can visually and timely display the special effect of the special effect nodes, the special effect editor has the function of visual operation, and the manufacturing efficiency and reliability of the special effect nodes are improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a schematic illustration of an implementation environment in accordance with the present invention;
FIG. 2 is a flow diagram illustrating a method for game engine based effect editing, according to an exemplary embodiment;
FIG. 3 is a flowchart illustrating details of the embodiment of FIG. 2 before step S270;
FIG. 4 is a diagram illustrating a game engine based effect editing apparatus according to an exemplary embodiment;
FIG. 5 is a block diagram illustrating a terminal 100 in accordance with an example embodiment;
FIG. 6 is a block diagram illustrating a game engine based special effects editing apparatus, according to an example embodiment;
fig. 7 is a block diagram illustrating a game engine based effect editing apparatus according to another exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the invention; rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
Cocos2dx is an open-source, cross-platform mobile 2D game framework released under the MIT license, and it is the game engine on which this scheme's implementation environment is based. A Cocos2dx project can easily be built and run on operating systems such as iOS, Android, and BlackBerry. Cocos2dx also supports the desktop operating systems Windows, Mac, and Linux, so source code written by developers can easily be edited and debugged there.
At present, a special effect editor based on Cocos2dx is mainly driven by program code. The implementation of such an editing scheme is quite complex: the work of each person responsible cannot be separated, a software engineer must cooperate by writing program code, all the links are intertwined, and communication costs are high.
Therefore, a purely coded special effect editing method cannot provide visual editing of special effect nodes, which directly reduces the efficiency and reliability of special effect node production.
Fig. 1 is a schematic diagram of an implementation environment according to the invention. The implementation environment is a special effect editor, which in the course of producing a special effect node mainly involves: a project file 110, at least one special effect node 120, and a special effect 160.
The project file 110 is a file generated by the special effect editor or by program code; in this scheme it may be an XML (Extensible Markup Language) file.
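Since the scheme stores a project as an XML file, loading it reduces to standard XML parsing. The sketch below is hypothetical: the patent does not publish the actual schema, so the `<project>`/`<node>` element names and attributes are illustrative only (Python is used for brevity here; the editor itself is built on MFC/C++).

```python
import xml.etree.ElementTree as ET

# Hypothetical project-file layout: the patent does not publish the actual
# schema, so the element and attribute names below are illustrative only.
PROJECT_XML = """
<project name="wnd_skillready">
  <node type="picture" name="glow">
    <material path="glow.png"/>
  </node>
  <node type="particle" name="sparks"/>
</project>
"""

def load_effect_nodes(xml_text):
    """Parse a project file and list (type, name) for each special effect node."""
    root = ET.fromstring(xml_text)
    return [(n.get("type"), n.get("name")) for n in root.findall("node")]
```

Whatever the real schema is, the editor's "loading" step amounts to such a parse followed by filling the parsed values into the editor's content items.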
The special effect nodes 120 include picture nodes, particle nodes, and the like. A user can create new special effect nodes as needed, add corresponding materials (such as pictures) to them, or directly import existing special effect nodes such as particle nodes.
The special effect 160 is the result displayed after the special effect editor previews the project file.
As shown in fig. 1, the special effect editor integrates the Cocos2dx engine into the VIEW (display interface) window of the MFC (Microsoft Foundation Classes); that is, the Cocos2dx engine can be operated directly within the MFC. The editor first builds an internal node tree for management. Special effects are produced by modifying the engine's special effect nodes, and the actions of those nodes are generated from action types. After modification is complete, the special effect can be saved to, or read back from, XML to generate the corresponding special effect nodes, enabling online maintenance and extension of special effects. In addition, the name of any node in the node tree can identify a specific name in the game, so corresponding numbers and the like can be replaced or edited.
Many functions are integrated into the view window, such as loading an XML file, editor input, saving the XML file, adding special effect nodes, adding action types, and previewing the editing effect. The special effect nodes that can be added include picture nodes, particle nodes, and the like, and the action types that can be added include parallel actions, loop actions, and the like. A picture node realizes a special effect by applying different processing modes to one or more pictures. A particle node is a special effect node produced with a particle system, a computer graphics technique for simulating specific phenomena. A particle system has unique advantages in simulating natural phenomena, physical phenomena, and space distortion, which makes it convenient to realize realistic, randomized effects such as explosions, fireworks, and flowing water; the Cocos2dx engine provides a powerful particle system.
Further, for example, when a special effect node is added and a picture node is selected, adding a picture or a picture file path automatically displays the corresponding picture in the preview frame.
Furthermore, an added picture node also has a hierarchy attribute, i.e., it can hold several layers so that multiple pictures are set at the same time; the special effect is realized by setting time intervals between the pictures or by ordering them.
Furthermore, adding a picture node includes choosing a rendering mode; there are various rendering modes, and different modes can be selected to realize different special effects.
Further, the processing corresponding to an action type can also be integrated, i.e., a function of adding meta-actions, which include moving, zooming, rotating, and the like. A meta-action further refines an action type; for example, the meta-action corresponding to a loop action may be rotation.
Further, for example, a "Move action" is added by specifying the movement time and the end coordinate, and a "MoveBy action" is added by specifying the offset from the original coordinate and the movement time.
Further, several execution types may be set for each meta-action, and a default execution type may also be set in advance, forcibly changing the parameter variation curve of the meta-action.
Further, a meta-action may also be processed by increasing or decreasing its speed, rotating clockwise or counterclockwise, changing the zoom frequency, controlling the duration, or changing the moving or zooming speed.
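The Move/MoveBy distinction and the execution-type curve above can be sketched as simple interpolation. In Cocos2dx these are C++ action classes (`MoveTo`, `MoveBy`, and easing wrappers); the Python sketch below only mirrors the semantics, and `ease_in` is one assumed curve shape, not the editor's actual default.

```python
def move_to(start, end, duration, t, ease=lambda p: p):
    """Absolute move: interpolate from start toward the specified end coordinate."""
    p = ease(min(t / duration, 1.0))
    return tuple(s + (e - s) * p for s, e in zip(start, end))

def move_by(start, offset, duration, t, ease=lambda p: p):
    """Relative move: interpolate by an offset from the original coordinate."""
    end = tuple(s + o for s, o in zip(start, offset))
    return move_to(start, end, duration, t, ease)

# One possible "execution type": a quadratic ease-in that reshapes the
# parameter variation curve of the meta-action.
ease_in = lambda p: p * p
```

Swapping in a different `ease` function changes only the parameter curve, which is exactly what setting an execution type is described as doing.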
FIG. 2 is a flow diagram illustrating a game engine based effect editing method according to an exemplary embodiment. As shown in fig. 2, the game engine-based effect editing method may include the following steps.
In step S210, the special effects editor loads and responds to the project file.
The project file is a file created or generated by the special effect editor; in this embodiment it is mainly an XML file, such as the project file 110 in fig. 1, whose file name is "wnd_skillready". Loading means that when a user creates a project file or imports an existing one through the special effect editor, the editor loads it directly according to the user's operation. While loading, the editor also responds to the content of the project file, where responding means reacting to that content. For example, when an existing project file is imported, its parameters or data are filled into the corresponding content items; when a project file is newly created, the editor's preset default parameters or data are used instead.
This step prepares the project file of the special effect nodes for the subsequent steps.
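The load-and-respond behaviour of step S210 can be sketched as a defaults overlay: a new project takes the editor's preset defaults, while an imported project overlays its own parameters on them. The parameter names below are assumptions; the patent only says preset defaults exist.

```python
# Illustrative editor presets; the actual default parameters are not published.
DEFAULTS = {"name": "untitled", "duration": 1.0, "loop": False}

def respond_to_project(loaded_params=None):
    """Step S210 response: a newly created project gets the editor's preset
    defaults; an imported project overlays its own parameters on them."""
    params = dict(DEFAULTS)
    if loaded_params:
        params.update(loaded_params)
    return params
```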
In step S230, a special effect node in the project file is acquired.
The special effect nodes include picture nodes, particle nodes, and the like, and the data parameters of each node are stored in the project file. If a new special effect node is created, the acquired node is a preset blank node or default node. After the special effect nodes are obtained from the project file, the user can operate on them, for example by renaming a node, adding a corresponding action type, or modifying a node's materials.
Through this step, the data parameters of the special effect nodes are obtained from the project file and can be displayed to the user visually, as a node tree, a list, or the like, such as the node display in fig. 1.
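The node-tree display just mentioned can be sketched as a minimal traversal. `EffectNode` and the two-space indentation are illustrative stand-ins, not the editor's actual data structure.

```python
class EffectNode:
    """Minimal stand-in for a special effect node in the editor's node tree."""
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

def render_tree(node, depth=0):
    """Flatten the node tree into the indented list an editor panel might show."""
    lines = ["  " * depth + node.name]
    for child in node.children:
        lines.extend(render_tree(child, depth + 1))
    return lines

root = EffectNode("effect", [EffectNode("picture_1"), EffectNode("particle_1")])
```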
In step S250, an update made by the user for the special effects node is received.
Typically, project files include both newly created files and already existing ones. For either kind, the user can import materials prepared in advance according to the design requirements of the special effect node; these materials, usually pictures, animations, and sounds, have been processed or designed beforehand for that node, as in fig. 1 where the picture material 130 is added to the special effect node 120. The user can also directly modify data parameters inside a node, for example naming the node, importing related materials, or setting different action types, and can add new special effect nodes, including particle nodes imported directly from the game engine.
Through this step, after the user modifies the special effect nodes, the special effect editor updates their data parameters accordingly.
In step S270, the effect of the effect node is displayed by previewing the project file.
After a special effect node is updated, the special effect editor can preview the project file, so that the updated effect of the node is displayed in real time.
Through this step, the user can promptly see whether his operations on the special effect node match the expected effect; if the previewed effect differs, he can remedy it in time, which effectively shortens the time spent discovering or locating problems.
In this process, the project file is loaded, the special effect nodes are obtained from it, the nodes are updated according to the user's operations, and the updated nodes are previewed so that their special effect is displayed. Whenever the data parameters of a special effect node change, the editor can preview and display the change in time, and the user sees the result intuitively. This aids the user during production, provides a reference for subsequent design or modification, gives the editor a visual-operation capability, effectively improves efficiency, and further ensures the accuracy and reliability of the special effect nodes.
In another exemplary embodiment illustrating a game engine-based effect editing method, step S250 may include step S251.
In step S251, the action type assigned to the special effect node is acquired.
The action types include parallel actions, loop actions, and the like; that is, the user selects or assigns to a special effect node an action type that achieves, or corresponds to, the desired special effect.
This step sets diversified effects for the special effect nodes and enriches the kinds of special effects.
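Timing of these composite action types can be sketched as simple duration algebra. This assumes parallel actions run simultaneously (total time is the longest member) and loop actions replay a fixed number of times; the patent does not spell out these rules, so they are inferred from the usual semantics of such actions.

```python
def sequence_duration(durations):
    """Actions played one after another: total time is the sum."""
    return sum(durations)

def parallel_duration(durations):
    """Actions played simultaneously: total time is the longest member."""
    return max(durations)

def loop_duration(inner_duration, repetitions):
    """A loop action replays its inner action a fixed number of times."""
    return inner_duration * repetitions
```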
In another exemplary embodiment illustrating the game engine-based effect editing method, after step S251, the method further includes step S253.
In step S253, a meta-action assigned to the action type is acquired.
After the action type assigned to the special effect node is obtained, it can be refined further. For example, if a picture node has been added and assigned a loop action, refined meta-actions such as rotation and zooming can be added to that loop action.
Through this step, special effect nodes with more varied and more refined effects can be realized, providing finer special effects.
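A rotate meta-action inside a loop action can be sketched as periodic interpolation. The linear angle curve and the 360-degree cycle are assumptions for illustration, not the editor's documented behaviour.

```python
def loop_rotate_angle(t, period, degrees_per_cycle=360.0):
    """Angle of a rotate meta-action inside a loop action at time t;
    the rotation restarts every `period` seconds."""
    return (t % period) / period * degrees_per_cycle
```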
In another exemplary embodiment illustrating the game engine-based effect editing method, after step S257, the method further includes step S258.
In step S258, the adjustment to the project file according to the special effect is obtained.
After analyzing the project file, the special effect displayed after previewing a node, or the result displayed after previewing imported material, a user who finds an inconsistency can immediately modify the data in the special effect node or adjust or replace the imported material. After such a modification, the special effect editor acquires the modified, adjusted, or changed data.
Through this step, after the special effect or material is previewed, the user can correct the node or material in time according to the displayed result and quickly achieve the expected effect, effectively shortening the time required and improving efficiency.
Fig. 3 is a flowchart for describing details before step S270 of the embodiment in fig. 2, and before step S270, the method further includes the following steps.
In step S261, the hierarchy of the special effect node is corrected and the rendering frequency is reduced.
The hierarchy here is the nesting level of the nodes, determined mainly by whether special effect nodes overlap: if they do not overlap, they are corrected to the same level. The correction is performed automatically by the special effect editor while the preview of the special effect nodes is generated, and correcting the hierarchy reduces the drawing frequency. During the preview, the editor directly obtains the activation time of each special effect node, calculates its execution time, and synchronizes the execution time of the whole special effect according to the activation and execution times.
Specifically, the activation time is preset by the user in the special effect editor according to the desired effect of the node. The execution time is the time from the start of playing the special effect to its end; accumulating all the nodes' execution times yields the execution time of the whole special effect.
Through this step, the special effect nodes are preprocessed; correcting their levels reduces the drawing frequency and so effectively improves running efficiency.
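One plausible reading of the hierarchy correction and time synchronization is sketched below: nodes whose bounding boxes do not overlap are assigned the same level, and the whole effect ends when the last node (activation time plus execution time) finishes. Both the bounding-box overlap test and the end-time rule are assumptions drawn from this description, not the patent's exact algorithm.

```python
def overlaps(a, b):
    """Axis-aligned boxes (x1, y1, x2, y2) overlap if their interiors intersect."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def assign_levels(rects):
    """Give every node the lowest level on which it overlaps nothing, so
    non-overlapping nodes share a level, as the correction requires."""
    levels, out = [], []
    for r in rects:
        for i, level in enumerate(levels):
            if not any(overlaps(r, other) for other in level):
                level.append(r)
                out.append(i)
                break
        else:
            levels.append([r])
            out.append(len(levels) - 1)
    return out

def total_effect_time(nodes):
    """nodes: (activation_time, execution_time) pairs; the whole special
    effect ends when the last node finishes playing."""
    return max(a + e for a, e in nodes)
```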
In step S263, the picture materials in the special effect node are merged.
After the preview is executed, the special effect editor automatically merges the pictures in the special effect nodes. Before responding to the preview of the XML file's special effect, the picture materials are merged, meaning that several pictures are combined into one. For example, a picture node often carries several pictures arranged in a particular order to achieve the expected effect; when the XML file is previewed, the special effect editor merges them automatically according to the picture material data in the XML file.
Through this step, the number of pictures is effectively reduced, and with it the number of vertices drawn.
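Merging picture materials amounts to packing the sprites into one texture atlas, so many small textures become one. The single-row packer below is a deliberately naive sketch; real atlas packers (and whatever the editor actually uses) are more sophisticated.

```python
def pack_row(sizes):
    """Naive single-row atlas: place sprites side by side, left to right.
    sizes: (width, height) pairs. Returns (atlas_w, atlas_h, offsets)."""
    offsets, x, atlas_h = [], 0, 0
    for w, h in sizes:
        offsets.append((x, 0))
        x += w
        atlas_h = max(atlas_h, h)
    return x, atlas_h, offsets
```

After packing, every sprite samples one shared texture at its recorded offset, which is what lets the renderer cut the picture count.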
In step S265, the number of times of drawing the vertex and the number of times of drawing call of each frame of the graphics processing library are calculated.
When the special effect editor executes a preview, it calculates the number of vertices drawn and the number of draw calls per frame of the graphics processing library, where the graphics processing library is OpenGL (Open Graphics Library).
The calculated vertex count and draw call count prepare for the subsequent adjustment of the calling frequency.
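The per-frame statistics can be sketched by treating each sprite as one textured quad (4 vertices) and assuming that consecutive sprites sharing a texture batch into one OpenGL draw call. The batching rule is an assumption modeled on sprite-batching renderers, not a statement of the editor's exact counting method.

```python
def frame_stats(sprites):
    """sprites: texture names in draw order; each sprite is one textured quad
    (4 vertices). Consecutive sprites sharing a texture are assumed to batch
    into a single draw call, as a Cocos2dx-style renderer does."""
    vertices = 4 * len(sprites)
    draw_calls = sum(1 for i, t in enumerate(sprites)
                     if i == 0 or sprites[i - 1] != t)
    return vertices, draw_calls
```

Under this model, merging pictures into one atlas collapses the texture switches and hence the draw calls, which is why the earlier merging step helps.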
In step S267, the adjustment of the calling frequency of the special effect node according to the drawing number and the drawing calling frequency is obtained.
From the vertex count and draw call count calculated while the special effect editor executes the preview, the user can judge the calling frequency of a special effect node and adjust it; complex special effect nodes in particular can be adjusted this way.
After the special effect nodes are adjusted according to their calling frequency, execution efficiency is effectively optimized, improving the running efficiency of the special effect nodes.
FIG. 4 is a diagram illustrating a game engine based effect editing apparatus according to an exemplary embodiment. The following describes one exemplary effect editing.
Specifically, as shown in fig. 4, the XML file is first loaded (either a newly created XML file or an existing one). After the project file is loaded, the special effect nodes or materials can be edited in the special effect editor according to the pre-designed scheme, where editing includes adding and removing special effect nodes, sequence actions, and meta-actions, and adjusting related event parameters. After the user enters the relevant data parameters in the editor, the effect can be previewed in the editor window. For example, an added picture (and likewise other materials) is displayed directly in the special effect editor window, so the user can adjust or correct it in time according to how it looks. The XML file can be saved while the editor is running, and all nodes in the special effect editor can also be previewed to display their effects. The special effect nodes 530 in fig. 6 may include picture nodes, particle nodes, and the like; the sequence actions include parallel actions, loop actions, and the like; and the meta-actions include moving, zooming, rotating, and the like.
Fig. 5 is a block diagram illustrating a terminal 100 according to an example embodiment. The terminal 100 may be implemented as a computer device in the above-described implementation environment.
Referring to fig. 5, the terminal 100 may include one or more of the following components: a processing component 101, a memory 102, a power component 103, a multimedia component 104, an audio component 105, a sensor component 107 and a communication component 108. The above components are not all necessary, and the terminal 100 may add other components or reduce some components according to its own functional requirements, which is not limited in this embodiment.
The processing component 101 generally controls overall operations of the terminal 100, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 101 may include one or more processors 109 to execute instructions to perform all or a portion of the above-described operations. Further, the processing component 101 may include one or more modules that facilitate interaction between the processing component 101 and other components. For example, the processing component 101 may include a multimedia module to facilitate interaction between the multimedia component 104 and the processing component 101.
The memory 102 is configured to store various types of data to support operations at the terminal 100. Examples of such data include instructions for any application or method operating on terminal 100. The Memory 102 may be implemented by any type of volatile or non-volatile Memory device or combination thereof, such as an SRAM (Static Random Access Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), an EPROM (Erasable Programmable Read-Only Memory), a PROM (Programmable Read-Only Memory), a ROM (Read-Only Memory), a magnetic Memory, a flash Memory, a magnetic disk, or an optical disk. Also stored in memory 102 are one or more modules configured to be executed by the one or more processors 109 to perform all or part of the steps of the method shown in any of fig. 2 and 3.
The power supply component 103 provides power to the various components of the terminal 100. The power components 103 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the terminal 100.
The multimedia component 104 includes a screen providing an output interface between the terminal 100 and the user. In some embodiments, the screen may include an LCD (Liquid Crystal Display) and a TP (Touch Panel). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The audio component 105 is configured to output and/or input audio signals. For example, the audio component 105 includes a microphone configured to receive external audio signals when the terminal 100 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 102 or transmitted via the communication component 108. In some embodiments, audio component 105 also includes a speaker for outputting audio signals.
The sensor assembly 107 includes one or more sensors for providing various aspects of state assessment for the terminal 100. For example, the sensor assembly 107 can detect an open/close state of the terminal 100, relative positioning of the components, a change in position of the terminal 100 or a component of the terminal 100, and a change in temperature of the terminal 100. In some embodiments, the sensor assembly 107 may also include a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 108 is configured to facilitate communications between the terminal 100 and other devices in a wired or wireless manner. The terminal 100 may access a WIreless network based on a communication standard, such as WiFi (WIreless-Fidelity), 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 108 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the Communication component 108 further includes a Near Field Communication (NFC) module to facilitate short-range Communication. For example, the NFC module may be implemented based on an RFID (Radio Frequency Identification) technology, an IrDA (Infrared Data Association) technology, an UWB (Ultra-Wideband) technology, a BT (Bluetooth) technology, and other technologies.
In an exemplary embodiment, the terminal 100 may be implemented by one or more ASICs (Application Specific Integrated circuits), DSPs (Digital Signal processors), PLDs (Programmable Logic devices), FPGAs (Field-Programmable Gate arrays), controllers, microcontrollers, microprocessors or other electronic components for performing the above-described methods.
The specific manner in which the processor of the terminal in this embodiment performs operations has been described in detail in the embodiment related to the game engine-based effect editing method, and will not be described in detail here.
In an exemplary embodiment, a storage medium is also provided that is a computer-readable storage medium, such as may be transitory and non-transitory computer-readable storage media, including instructions. The storage medium includes, for example, a memory 102 of instructions executable by a processor 109 of the terminal 100 to perform the game engine based effect editing method described above.
Optionally, the present invention further provides a terminal, which executes all or part of the steps of the special effect editing method based on the game engine shown in any one of fig. 2 and fig. 3. The terminal includes:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein the content of the first and second substances,
the memory stores instructions executable by the at least one processor or the processor reads a computer program from a computer readable storage medium, the instructions or the computer program being executable by the at least one processor to enable the at least one processor to perform a game engine based effect editing method as shown in any of the above exemplary embodiments.
The following is an embodiment of the apparatus of the present invention, which can be used to implement the embodiment of the special effect editing method based on the game engine. For details that are not disclosed in the embodiments of the device of the present invention, please refer to the embodiments of the method for editing special effects based on game engine of the present invention.
FIG. 6 is a block diagram of a game engine based effect editing apparatus, which may be used in the implementation environment shown in FIG. 1 to perform all or some of the steps of the game engine based effect editing method shown in FIG. 2, according to an example embodiment. As shown in fig. 6, the special effect editing apparatus includes, but is not limited to: a response module 310, an acquisition module 330, a receiving module 350, and a display module 370.
The response module 310 is used for the special effect editor to load the project file and respond to the project file.
The obtaining module 330 is configured to obtain special effect nodes in the engineering file.
A receiving module 350, configured to receive an update made by a user for the special effect node.
And a display module 370, configured to display the special effect of the special effect node by previewing the engineering file.
The implementation process of the functions and actions of each module in the device is specifically described in the implementation process of the corresponding steps in the game engine-based special effect editing method, and is not described herein again.
Optionally, in another exemplary embodiment, the receiving module 350 in the game engine-based special effect editing apparatus may further include, but is not limited to: an action type acquisition unit.
The action type acquisition unit is used for acquiring the action type distributed to the special effect node.
Optionally, in another exemplary embodiment, the game engine-based special effect editing apparatus may further include, but is not limited to: and a meta-action acquisition module.
The meta-action obtaining module is used for obtaining the meta-action assigned to the action type.
Optionally, in another exemplary embodiment, the game engine-based special effect editing apparatus may further include, but is not limited to: and an adjusting module.
The adjusting module is used for adjusting the engineering file according to the special effect.
FIG. 7 is a block diagram of a game engine based effect editing apparatus, which may be used in the implementation environment shown in FIG. 1 to perform all or some of the steps of the game engine based effect editing method shown in FIG. 3, according to another exemplary embodiment. As shown in fig. 7, the special effect editing apparatus includes, but is not limited to: a modification module 361, a combination module 363, a quantity calculation module 365, and a frequency adjustment module 367.
And the correcting module 361 is used for correcting the hierarchy of the special effect node and reducing the drawing frequency.
And the merging module 363 is configured to merge the picture materials.
The number calculating module 365 is configured to calculate the vertex drawing number and the drawing call number of each frame of the graphics processing library.
The frequency adjusting module 367 is configured to obtain an adjustment of the call frequency of the special effect node according to the number of drawings and the number of drawing calls.
It is to be understood that the invention is not limited to the precise arrangements described above and shown in the drawings, and that various modifications and changes may be effected therein by one skilled in the art without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (10)

1. A game engine-based effect editing method, comprising the steps of:
loading a project file by a special effect editor and responding to the project file;
acquiring special effect nodes in the engineering file;
receiving the update made by the user for the special effect node;
displaying the special effect of the special effect node by previewing the engineering file;
in the process of generating preview of the special effect nodes, automatically correcting the levels of the special effect nodes and reducing the drawing frequency, and if the special effect nodes are not mutually overlapped, correcting the special effect nodes to be the same level;
when the engineering file executes preview, the picture materials in the special effect node are merged, the drawing number and the drawing calling times of the vertex of each frame of the graphic processing library are calculated, and the adjustment of the calling frequency of the special effect node according to the drawing number and the drawing calling times is obtained.
2. The method of claim 1, wherein the step of receiving updates made by the user to the special effects node comprises:
and acquiring the action type distributed to the special effect node.
3. The method of claim 2, wherein after the step of obtaining the action type assigned to the special effects node, the method further comprises:
obtaining a meta-action assigned to the action type.
4. The method of claim 1, wherein after the step of displaying the special effects effect of the special effects node by previewing the project file, the method further comprises:
and obtaining the adjustment of the engineering file according to the special effect.
5. The method of claim 1, further comprising:
in the process of generating preview of the special effect node, the special effect editor obtains the activation time of the special effect node;
calculating the execution time of the special effect node, and synchronizing the execution time of the whole special effect according to the activation time and the execution time;
the activation time is a time preset by a user in the special effect editor according to the effect of the special effect node, the execution time is a time from the start of playing the special effect to the end of playing the special effect, and the execution time of the whole special effect is a time obtained by accumulating the execution times.
6. A computer-readable storage medium storing a computer program for electronic data exchange, characterized in that the computer program, when executed, causes a terminal to perform the method according to any of claims 1-5.
7. A terminal, characterized in that the terminal comprises:
a processor; and
a memory communicatively coupled to the processor; wherein the content of the first and second substances,
the memory stores instructions executable by the processor or the processor reads a computer program from a computer readable storage medium, the instructions or the computer program being executable by the processor to enable the processor to perform the method of any of claims 1-5.
8. An effect editing apparatus based on a game engine, the apparatus comprising:
the response module is used for loading the project file by the special effect editor and responding to the project file;
the acquisition module is used for acquiring special effect nodes in the engineering file;
the receiving module is used for receiving the update of the user for the special effect node;
the display module is used for displaying the special effect of the special effect node by previewing the engineering file;
in the process of generating preview of the special effect nodes, automatically correcting the levels of the special effect nodes and reducing the drawing frequency, and if the special effect nodes are not mutually overlapped, correcting the special effect nodes to be the same level;
when the engineering file executes preview, merging picture materials in the special effect nodes, calculating the drawing number and the drawing calling times of the vertexes of each frame of graphic processing library, and obtaining the adjustment of the calling frequency of the special effect nodes according to the drawing number and the drawing calling times.
9. The apparatus of claim 8, wherein the receiving module comprises:
and the action type acquisition unit is used for acquiring the action type distributed to the special effect node.
10. The apparatus of claim 8, further comprising:
the correction module is used for correcting the hierarchy of the special effect node and reducing the drawing frequency;
the merging module is used for merging the picture materials;
the quantity calculation module is used for calculating the vertex drawing number and drawing calling times of each frame of the graphic processing library;
and the frequency adjusting module is used for obtaining the adjustment of the calling frequency of the special effect node according to the drawing number and the drawing calling times.
CN201710387683.4A 2017-05-27 2017-05-27 Special effect editing method and device based on game engine Active CN108287718B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710387683.4A CN108287718B (en) 2017-05-27 2017-05-27 Special effect editing method and device based on game engine

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710387683.4A CN108287718B (en) 2017-05-27 2017-05-27 Special effect editing method and device based on game engine

Publications (2)

Publication Number Publication Date
CN108287718A CN108287718A (en) 2018-07-17
CN108287718B true CN108287718B (en) 2022-05-17

Family

ID=62831464

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710387683.4A Active CN108287718B (en) 2017-05-27 2017-05-27 Special effect editing method and device based on game engine

Country Status (1)

Country Link
CN (1) CN108287718B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110147231B (en) * 2019-05-23 2021-11-02 腾讯科技(深圳)有限公司 Combined special effect generation method and device and storage medium
CN110704043B (en) * 2019-09-11 2023-07-28 广州方硅信息技术有限公司 Special effect implementation method and device, electronic equipment and storage medium
CN110865809B (en) * 2019-11-14 2023-05-09 珠海金山数字网络科技有限公司 Method and device for importing data into illusion engine
CN113694531B (en) * 2020-05-21 2024-01-19 抖音视界有限公司 Game special effect generation method and device, electronic equipment and computer readable medium
CN113018867A (en) * 2021-03-31 2021-06-25 苏州沁游网络科技有限公司 Special effect file generating and playing method, electronic equipment and storage medium
CN113163259A (en) * 2021-05-10 2021-07-23 宝宝巴士股份有限公司 FFmpeg-based video node rendering method and device
CN116459508A (en) * 2022-01-11 2023-07-21 脸萌有限公司 Special effect prop generation method, picture processing method and device and electronic equipment

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3645024B2 (en) * 1996-02-06 2005-05-11 株式会社ソニー・コンピュータエンタテインメント Drawing apparatus and drawing method
US7027045B2 (en) * 2001-03-16 2006-04-11 Mitsubishi Electric Research Labs Inc. Modeling graphics objects with topological hints
CN103577172A (en) * 2012-07-30 2014-02-12 无锡梵天信息技术股份有限公司 Graphic processing engine system
CN103699379A (en) * 2013-12-13 2014-04-02 福建天趣网络科技有限公司 Visible editing method and editor in game role fighting process
US20160154644A1 (en) * 2014-08-29 2016-06-02 Ram Chhawchharia Real-time previewing and modifying an application under development
CN104392479B (en) * 2014-10-24 2017-05-10 无锡梵天信息技术股份有限公司 Method of carrying out illumination coloring on pixel by using light index number
CN104793927A (en) * 2014-12-30 2015-07-22 北京白鹭时代信息技术有限公司 Interface editing method and device
CN105184847B (en) * 2015-10-16 2017-12-12 上海恺英网络科技有限公司 The rendering intent of 3D game rendering engines
CN105354872B (en) * 2015-11-04 2018-02-27 深圳墨麟科技股份有限公司 A kind of rendering engine based on 3D web games, implementation method and tools

Also Published As

Publication number Publication date
CN108287718A (en) 2018-07-17

Similar Documents

Publication Publication Date Title
CN108287718B (en) Special effect editing method and device based on game engine
CN110134600B (en) Test script recording method, device and storage medium
CN111596912A (en) Non-programming visual construction system and method for radar display control software based on component library
CN111552468B (en) Unity-based prefab editing method and device and storage medium
CN110502415B (en) Buried point setting method, device and equipment
CN102800045A (en) Image processing method and device
US20220179642A1 (en) Software code change method and apparatus
CN108958843A (en) Plug-in unit construction method, system, equipment and medium based on lightweight script
CN113268226A (en) Page data generation method and device, storage medium and equipment
US20230418562A1 (en) Interactive graphic design system to enable creation and use of variant component sets for interactive objects
CN113010359A (en) Bus test system generation method, system, device and storage medium
CN112306480A (en) Visual programming control method, system, device and computer storage medium
CN110096304A (en) Task construction method, device, equipment and storage medium based on Jenkins
CN108984623B (en) Data query condition generation method and device, storage medium and electronic equipment
CN113010157A (en) Code generation method and device
CN115129574A (en) Code testing method and device
CN112328347A (en) Application rule configuration method and device and page processing method
CN116301772A (en) Service code development method, device, equipment and medium
US20060041324A1 (en) System and method of editing a program used for a programmable logic controller
CN114241174A (en) Special effect prop generation method, device, equipment and medium
CN108132782B (en) Automatic programming device and electronic equipment
CN113129806A (en) Display screen replacement method and device
JP2009169628A (en) Construction device, construction method and program for monitoring control system
CN111767063A (en) Resource updating method, device and equipment for application program
CN113838171B (en) Data processing method, data processing device, storage medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant