CN117372590A - Particle special effect generation method, device, equipment and storage medium


Info

Publication number
CN117372590A
CN117372590A (application CN202210769963.2A)
Authority
CN
China
Prior art keywords
particle
time
particles
target
candidate
Prior art date
Legal status
Pending
Application number
CN202210769963.2A
Other languages
Chinese (zh)
Inventor
阎逸飞 (Yan Yifei)
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202210769963.2A
Publication of CN117372590A
Legal status: Pending


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 - General purpose image data processing
    • G06T 1/20 - Processor architectures; Processor configuration, e.g. pipelining
    • G06T 15/00 - 3D [Three Dimensional] image rendering
    • G06T 15/10 - Geometric effects
    • G06T 15/20 - Perspective computation
    • G06T 15/205 - Image-based rendering

Abstract

This application discloses a particle effect generation method, apparatus, device, and storage medium, belongs to the field of computer technology, and can be applied to in-vehicle scenarios. The method comprises the following steps: an image processor extracts reference data of target particles from reference data of candidate particles, where the candidate particles are all particles on which each frame of the particle effect is based, the target particles are the candidate particles that satisfy a selection condition at a first time, and the first time is the generation time of any one frame of the particle effect; based on the reference data of the target particles, the image processor obtains the rendering data corresponding to the target particles at the first time; and based on that rendering data, it generates that frame of the particle effect. The generation process requires no participation by the central processor: the image processor generates the particle effect autonomously without waiting for the central processor's calculations, so the generation efficiency is no longer bounded by the central processor's computing efficiency and is thereby improved.

Description

Particle special effect generation method, device, equipment and storage medium
Technical Field
Embodiments of this application relate to the field of computer technology, and in particular to a particle effect generation method, apparatus, device, and storage medium.
Background
With the development of computer technology, particle systems are increasingly used in games and animation. Running a particle system produces particle effects, which are typically used to simulate abstract visual phenomena in real scenes, such as sparks, blade glints, lightning, clouds, fog, dust, snow, explosions, smoke, water flow, falling leaves, meteor trails, or glowing trajectories.
In the related art, particle effects are generated as follows: the central processor in the computer device calculates the data required to generate each frame of the particle effect and sends the calculated data to the image processor, which then generates that frame from the data.
In this approach, both the central processor and the image processor participate in generating every frame, and the image processor can only start generating a frame after the central processor has finished its calculation. The efficiency of generating particle effects is therefore limited by the central processor's computing efficiency and is difficult to improve effectively.
Disclosure of Invention
Embodiments of this application provide a particle effect generation method, apparatus, device, and storage medium that can improve the efficiency of particle effect generation. The technical solution is as follows:
In one aspect, an embodiment of this application provides a particle effect generation method, performed by an image processor in a computer device, where reference data of candidate particles is stored in the image processor. The method includes:
extracting reference data of target particles from the reference data of the candidate particles, where the candidate particles are all particles on which each frame of the particle effect is based, the target particles are the candidate particles that satisfy a selection condition at a first time, and the first time is the generation time of any one frame of the particle effect;
obtaining, based on the reference data of the target particles, rendering data corresponding to the target particles at the first time; and generating that frame of the particle effect based on the rendering data corresponding to the target particles at the first time.
In another aspect, a particle effect generation apparatus is provided, the apparatus comprising:
an extraction unit, configured to extract reference data of target particles from reference data of candidate particles, where the candidate particles are all particles on which each frame of the particle effect is based, the target particles are the candidate particles that satisfy a selection condition at a first time, and the first time is the generation time of any one frame of the particle effect;
an acquisition unit, configured to obtain, based on the reference data of the target particles, rendering data corresponding to the target particles at the first time;
and a generating unit, configured to generate that frame of the particle effect based on the rendering data corresponding to the target particles at the first time.
In one possible implementation, the reference data of the target particle includes the initial rendering data corresponding to the target particle's generation time. The acquisition unit is configured to calculate, based on the initial rendering data and using rendering data calculation logic, the rendering data corresponding to the target particle at the first time, the rendering data calculation logic indicating the relationship between the rendering data corresponding to the target particle at the first time and the initial rendering data.
In one possible implementation, the reference data of the target particle includes rendering data corresponding to the target particle at each candidate time within its life cycle. The acquisition unit is configured to determine, among the candidate times, a target candidate time that matches the first time, and to use the rendering data corresponding to the target particle at the target candidate time as the rendering data corresponding to the target particle at the first time.
In one possible implementation, the frame is the initial frame of the particle effect, and the target particles are the candidate particles whose generation time matches the first time.
In one possible implementation, the frame is not the initial frame of the particle effect, and the target particles include the candidate particles whose generated duration at the first time is less than their survival duration, as well as the candidate particles whose generation time matches the first time.
In one possible implementation, the apparatus further includes:
a determining unit, configured to determine the validity of the candidate particles at the first time according to their life cycles, and to take the candidate particles that are valid at the first time as the target particles.
In one possible implementation, the rendering data corresponding to the target particle at the first time includes at least one of size data, rotation data, speed data, position data, color data, and map data corresponding to the target particle at the first time.
In one possible implementation, the apparatus further includes:
a receiving unit, configured to receive the reference data of the candidate particles sent by a central processor in the computer device, where the reference data is obtained by the central processor transforming a particle data resource, and the particle data resource is built in an offline build stage;
and a storage unit, configured to store the reference data of the candidate particles.
In one possible implementation, the offline build stage includes an effect editing stage and a resource build stage, and the particle data resource is built in the resource build stage when a mode setting condition is satisfied in the effect editing stage, the condition being that the particle effect generation mode set in the effect editing stage is a mode that instructs the image processor to generate each frame of the particle effect.
In another aspect, a computer device is provided, including a processor and a memory, where at least one computer program is stored in the memory, and the at least one computer program is loaded and executed by the processor to cause the computer device to implement any of the particle effect generation methods described above.
In another aspect, a computer-readable storage medium is provided, in which at least one computer program is stored, the at least one computer program being loaded and executed by a processor to cause a computer to implement any of the particle effect generation methods described above.
In another aspect, a computer program product is provided, including a computer program or computer instructions, the computer program or computer instructions being loaded and executed by a processor to cause a computer to implement any of the particle effect generation methods described above.
The technical solution provided by the embodiments of this application brings at least the following beneficial effects:
The image processor in the computer device stores the reference data of all particles on which each frame of the particle effect is based, so the image processor can obtain the rendering data required for generating the effect from that reference data on its own and then generate the effect. The generation process requires no participation by the central processor: the image processor generates each frame autonomously without waiting for the central processor's calculations, so the generation efficiency is no longer bounded by the central processor's computing efficiency and is thereby improved.
Drawings
To describe the technical solutions in the embodiments of this application more clearly, the drawings required in the description of the embodiments are briefly introduced below. The drawings show only some embodiments of this application; a person skilled in the art may derive other drawings from them without inventive effort.
Fig. 1 is a schematic diagram of an implementation environment of a particle effect generation method according to an embodiment of this application;
Fig. 2 is a flowchart of a particle effect generation method according to an embodiment of this application;
Fig. 3 is a schematic diagram of an effect editing page according to an embodiment of this application;
Fig. 4 is a schematic diagram of a resource build page according to an embodiment of this application;
Fig. 5 is a schematic diagram of a particle effect generation process according to an embodiment of this application;
Fig. 6 is a schematic diagram of another particle effect generation process according to an embodiment of this application;
Fig. 7 is a schematic diagram of a particle effect generation apparatus according to an embodiment of this application;
Fig. 8 is a schematic structural diagram of a computer device according to an embodiment of this application.
Detailed Description
To make the objects, technical solutions, and advantages of this application clearer, the embodiments of this application are described in further detail below with reference to the accompanying drawings.
It should be noted that the terms "first", "second", and the like herein are used to distinguish similar objects and do not necessarily describe a particular order or sequence. It should be understood that data so termed may be interchanged where appropriate, so that the embodiments described herein can be implemented in orders other than those illustrated or described. The implementations described in the following exemplary embodiments do not represent all implementations consistent with this application; they are merely examples of apparatuses and methods consistent with some aspects of this application as detailed in the appended claims.
Fig. 1 is a schematic diagram of an implementation environment of the particle effect generation method according to an embodiment of this application. The implementation environment may include: a computer device 11.
The computer device 11 runs an application program or web page capable of displaying particle effects; when the application program or web page needs to display a particle effect, the method provided by the embodiments of this application can be used to generate and display it.
By way of example, an application program capable of displaying particle effects may be a program that needs to be downloaded and installed, or an embedded program that runs inside a host program, which the embodiments of this application do not limit. An embedded program is an application developed in a programming language that depends on a host program to run: it does not need to be downloaded and installed and only needs to be dynamically loaded within the host program. A user can find a needed embedded program by searching or scanning a code, open it with a tap, and close it when finished, after which it occupies no memory on the terminal, which is very convenient.
By way of example, applications capable of displaying particle effects may include, but are not limited to, VR (Virtual Reality) applications, AR (Augmented Reality) applications, three-dimensional map applications, game applications, social applications, interactive entertainment applications, and the like. Illustratively, game applications include, but are not limited to, shooting games, MOBA (Multiplayer Online Battle Arena) games, SLG (Strategy Game) titles, and the like.
Optionally, the computer device 11 may be any electronic product that can interact with a user through one or more of a keyboard, touch pad, touch screen, remote control, voice interaction, or handwriting device, such as a PC (Personal Computer), mobile phone, smartphone, PDA (Personal Digital Assistant), wearable device, PPC (Pocket PC), tablet computer, smart in-vehicle system, smart voice interaction device, smart home appliance (e.g., smart TV or smart speaker), aircraft, in-vehicle terminal, VR device, or AR device. For example, when the computer device 11 is an in-vehicle terminal, the method provided by the embodiments of this application can be applied to in-vehicle scenarios.
Illustratively, the computer device 11 includes a central processing unit (Central Processing Unit, CPU) and an image processor (Graphics Processing Unit, GPU). The central processor mainly handles data during the operation of the computer device 11, while the image processor mainly renders and draws the content to be shown on the display screen. With the particle effect generation method provided by the embodiments of this application, the central processor in the computer device 11 sends data to the image processor once in advance, after which the image processor completes the particle effect generation process autonomously, improving generation efficiency.
Those skilled in the art will appreciate that the above computer device 11 is only an example, and that other existing or future computer devices applicable to this application also fall within its scope of protection.
The method provided by the embodiments of this application is used to generate particle effects, which are typically used to simulate abstract visual phenomena in real scenes, such as sparks, blade glints, lightning, clouds, fog, dust, snow, explosions, smoke, water flow, falling leaves, meteor trails, or glowing trajectories. Particles are the basic units from which a particle effect is built: they are tiny objects, and moving them according to a given algorithm displays the effect. Particle effects are realized by running a particle system, a computer graphics technique that simulates abstract visual phenomena with particles; the particle system prescribes how particles are controlled over their life cycle of generation, movement, change, and disappearance so as to simulate the phenomenon and present the effect.
Particle effects are generated frame by frame, and each frame has a generation time. The time interval between the generation times of every two adjacent frames is a reference time interval, set empirically or adjusted flexibly for the application scenario, which the embodiments of this application do not limit; for example, the reference time interval may be 16 ms (milliseconds), though other values are possible. The generation time of each frame can be determined from the start time of the particle system: illustratively, the start time of the particle system is the generation time of the first frame, and the generation time of the Nth frame (N an integer not less than 1) is the sum of the start time and a first duration, the first duration being the product of the reference time interval and (N-1). The start time of the particle system is the moment at which the computer device begins running it; that is, the start time and the per-frame generation times are real times on the computer device.
The embodiments of this application provide a particle effect generation method that can be applied in the implementation environment shown in Fig. 1. The method is performed by the image processor in the computer device 11, in which reference data of candidate particles is stored. As shown in Fig. 2, the method may include the following steps 201 and 202.
In step 201, reference data of target particles is extracted from the reference data of candidate particles, where the candidate particles are all particles on which each frame of the particle effect is based, the target particles are the candidate particles that satisfy a selection condition at a first time, and the first time is the generation time of any one frame of the particle effect.
In the embodiments of this application, the image processor stores the reference data of the candidate particles in advance. The candidate particles are all particles on which each frame of the particle effect is based; their number and properties may be set by a technician according to actual requirements, which the embodiments of this application do not limit. By way of example, the candidate particles can be regarded as all particles involved in the life cycle of the particle system, i.e., the period between the start and the end of the particle system's operation.
The reference data of a candidate particle is the data from which the rendering data corresponding to the candidate particle at each candidate time within its life cycle is obtained, and the rendering data of a candidate particle is the particle-related data on which the particle effect is generated.
The embodiments of this application do not limit what the reference data of a candidate particle contains, as long as the rendering data corresponding to the candidate particle at each candidate time can be determined from it.
In an exemplary embodiment, the reference data of a candidate particle includes the initial rendering data corresponding to the candidate particle at its generation time. This form can be used when the image processor in the computer device knows the relationship between a candidate particle's rendering data at any time and its initial rendering data and generation time.
In an exemplary embodiment, the reference data of a candidate particle includes the rendering data corresponding to the candidate particle at each candidate time. The candidate times are times within the candidate particle's life cycle, measured relative to the start time of the particle system and sampled at the effect update frequency, so the time difference between every two adjacent candidate times equals the difference between the generation times of two adjacent frames. In other words, each candidate time corresponds to the generation time of one frame: a candidate time, taken as an offset from the particle system's start time, equals the offset of that frame's generation time from the same start time.
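To make the sampling concrete, below is a minimal C++ sketch; names such as sampleCandidateTimes and the millisecond units are illustrative assumptions, not from the patent. It pre-samples candidate times at the effect update frequency and snaps a first time, already expressed as an offset from the particle system's start, to its matching candidate time.

```cpp
#include <cstddef>
#include <vector>

// Pre-sample candidate times over a particle's life cycle. Each candidate
// time is an offset (ms) relative to the particle system's start time, so
// the k-th candidate time equals the generation-time offset of frame k+1.
std::vector<double> sampleCandidateTimes(double lifeCycleEndMs, double refIntervalMs) {
    std::vector<double> times;
    for (double t = 0.0; t <= lifeCycleEndMs; t += refIntervalMs) {
        times.push_back(t);
    }
    return times;
}

// Snap a first time (converted to an offset from the system start) onto
// the sampling grid to find the matching target candidate time.
double matchCandidateTime(const std::vector<double>& times, double firstTimeOffsetMs,
                          double refIntervalMs) {
    std::size_t idx = static_cast<std::size_t>(firstTimeOffsetMs / refIntervalMs);
    if (idx >= times.size()) {
        idx = times.size() - 1;  // clamp to the end of the life cycle
    }
    return times[idx];
}
```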
In an exemplary embodiment, the reference data of the candidate particles is sent to the image processor by a central processor in the computer device; that is, before step 201 the method further includes: receiving the reference data of the candidate particles sent by the central processor in the computer device, and storing the reference data of the candidate particles. The central processor sends the reference data to the image processor before the image processor generates the initial frame of the particle effect (i.e., the first frame), so that the generation of every frame can be executed by the image processor, improving generation efficiency. Upon receiving the reference data, the image processor may store it in its own memory space so that it can be conveniently extracted and used when generating the effect.
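The one-time hand-off can be pictured with the hedged stand-in below; the patent does not name a graphics API or data layout, so the image processor's memory space is modeled as a plain member vector and the struct fields are assumptions for illustration.

```cpp
#include <utility>
#include <vector>

// Per-particle reference data; the exact fields are illustrative (the
// patent only requires that rendering data at each candidate time be
// derivable from them).
struct ParticleReferenceData {
    float generationTimeMs;   // relative to the particle system's start
    float survivalTimeMs;     // generation-to-disappearance duration
    float initialSize;
    float initialPosition[3];
};

// Stand-in for the image processor. The central processor calls
// storeReferenceData() once, before the initial frame is generated; every
// later frame is produced from this stored copy without further CPU input.
class ImageProcessorModel {
public:
    void storeReferenceData(std::vector<ParticleReferenceData> data) {
        candidateReferenceData_ = std::move(data);  // kept in GPU-side storage
    }
    const std::vector<ParticleReferenceData>& referenceData() const {
        return candidateReferenceData_;
    }
private:
    std::vector<ParticleReferenceData> candidateReferenceData_;
};
```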
The reference data of the candidate particles is obtained by the central processor transforming a particle data resource, and the particle data resource is built in an offline build stage. Illustratively, the offline build stage is the stage in which the particle system is built with a development tool; it is called offline because it is not the actual online running stage. The embodiments of this application do not limit the development tool used to build the particle system, which can be chosen flexibly according to actual requirements: for example, the Unity engine (a game engine), the Laya engine (a game engine), or the developer tool provided by an application program for developing embedded programs, where the developer tool helps developers build embedded programs simply and efficiently. The process of building the particle system with a development tool is described later and is not detailed here.
The particle data resource is built from the reference data of the candidate particles; it is built in the offline build stage so as to transform the reference data into a form convenient for the central processor of the computer device to load, the transformed result being the particle data resource. Because the reference data is what the generation process actually uses, the central processor must transform the particle data resource after loading it, restoring it to a convenient-to-use form; the result of that transformation is the reference data of the candidate particles. Illustratively, transforming the particle data resource and building it are inverse processes: building the particle data resource may also be called serializing the reference data of the candidate particles, and transforming it may also be called deserializing the particle data resource.
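Since the patent leaves the resource format undisclosed, the round trip can be sketched as a plain byte serialization: buildResource stands for the offline build (serialization) and transformResource for the CPU-side inverse (deserialization). Record fields are illustrative.

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

struct ReferenceRecord {       // illustrative fixed-size reference record
    float generationTimeMs;
    float survivalTimeMs;
};

// Offline build stage: serialize reference data into a loadable resource.
std::vector<std::uint8_t> buildResource(const std::vector<ReferenceRecord>& refs) {
    std::vector<std::uint8_t> bytes(refs.size() * sizeof(ReferenceRecord));
    std::memcpy(bytes.data(), refs.data(), bytes.size());
    return bytes;
}

// CPU-side transformation: deserialize the resource back into reference data.
std::vector<ReferenceRecord> transformResource(const std::vector<std::uint8_t>& bytes) {
    std::vector<ReferenceRecord> refs(bytes.size() / sizeof(ReferenceRecord));
    std::memcpy(refs.data(), bytes.data(), refs.size() * sizeof(ReferenceRecord));
    return refs;
}
```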
In an exemplary embodiment, the offline build stage includes an effect editing stage and a resource build stage, and the particle data resource is built in the resource build stage when a mode setting condition is satisfied in the effect editing stage, the condition being that the particle effect generation mode set in the effect editing stage is a mode that instructs the image processor to generate each frame of the particle effect.
The effect editing stage is used to edit the appearance of the particle effect and to set the particle effect generation mode; the particle data resource is built in the resource build stage when the mode set in the effect editing stage instructs the image processor to generate each frame. The name of this mode is chosen empirically or adjusted for the application scenario; for example, it may be called "baking mode". In other words, the generation method provided by the embodiments of this application operates on the premise that the mode set in the effect editing stage instructs the image processor to generate each frame; a minimal configuration sketch follows.
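The sketch below expresses the mode setting condition in code; identifiers such as GenerationMode::GpuBaked are assumed names for illustration only.

```cpp
// The effect editing stage records, among other things, which component
// generates each frame; the resource build stage only runs under the
// GPU-driven ("baking") mode described above.
enum class GenerationMode {
    CpuPerFrame,  // related-art path: CPU computes each frame's data
    GpuBaked      // this application's path: image processor generates frames
};

struct EffectEditingResult {
    GenerationMode mode;
};

bool modeSettingConditionSatisfied(const EffectEditingResult& editing) {
    return editing.mode == GenerationMode::GpuBaked;
}
```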
Since the candidate particles are all particles on which each frame of the effect is based, once their reference data is stored in the image processor, the reference data of the particles on which any given frame is based can be extracted from it, whichever frame needs to be generated, enabling that frame to be generated.
Each frame of the particle effect is generated on the same principle, so the embodiments of this application take the generation of one arbitrary frame as an example. When generating that frame, its generation time is determined first, calculated by the formula t0 + (n - 1) * Δt, where t0 is the start time of the particle system, n (an integer not less than 1) is the index of the frame (i.e., the frame is the nth frame), and Δt is the reference time interval.
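As a direct transcription of this formula (using the 16 ms reference interval from the earlier example):

```cpp
#include <cassert>

// Generation time of the nth frame: t0 + (n - 1) * dt, where t0 is the
// particle system's start time and dt is the reference time interval.
double frameGenerationTimeMs(double t0Ms, int n, double dtMs = 16.0) {
    assert(n >= 1);  // n is an integer not less than 1
    return t0Ms + (n - 1) * dtMs;
}
// With t0 = 0: frame 1 -> 0 ms, frame 2 -> 16 ms, frame 3 -> 32 ms.
```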
After the generation time of the frame is determined, it is taken as the first time; the candidate particles that satisfy the selection condition at the first time are then determined and taken as the target particles, and the reference data of the target particles is extracted from the reference data of the candidate particles. The candidate particles that satisfy the selection condition at the first time (i.e., the target particles) are the particles on which the frame is based. Illustratively, there are multiple target particles, to ensure the visual quality of the frame.
In one possible implementation, the target particles are determined as follows: the validity of each candidate particle at the first time is determined according to its life cycle, and the candidate particles valid at the first time are taken as the target particles. The life cycle of a candidate particle is the period from its generation to its disappearance. Note that throughout its life cycle, a candidate particle always participates in the generation of the particle effect.
For example, the life cycle of a candidate particle may be represented as a time interval whose lower bound is the particle's generation time and whose upper bound is its disappearance time. Alternatively, it may be represented by a generation time and a survival duration, the survival duration being the time difference between the disappearance time and the generation time.
In an exemplary embodiment, the life cycle of a candidate particle is obtained in the build stage of the particle system. Since the start time of the particle system depends on the actual operation of the computer device, the generation time and disappearance time of a candidate particle may be expressed relative to the particle system's start time, so that the life cycle remains accurate regardless of when the system starts. For example, a generation time of 1 s (second) in a candidate particle's life cycle means the particle is generated 1 s after the particle system starts running.
For example, the life cycle of each candidate particle may be stored in the image processor in advance, so that the validity of the candidate particle at the first time can be determined from it; a candidate particle is either valid or invalid at the first time.
The life cycle of a candidate particle may be part of its reference data, or may be stored independently of the reference data, which the embodiments of this application do not limit.
In an exemplary embodiment, determining the validity of a candidate particle at the first time from its life cycle includes: calculating the first time relative to the start time of the particle system; if the calculated time falls within the candidate particle's life cycle, the candidate particle is valid at the first time, and otherwise it is invalid. If the life cycle is represented as a time interval, falling within the life cycle means the calculated time is not less than the interval's lower bound and not greater than its upper bound. If the life cycle is represented by a generation time and a survival duration, it means the difference between the calculated time and the generation time is not greater than the survival duration.
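Assuming the (generation time, survival duration) representation and millisecond offsets, the validity test could look like this:

```cpp
// Life cycle of a candidate particle, expressed relative to the particle
// system's start time.
struct LifeCycle {
    double generationTimeMs;  // when the particle is generated
    double survivalTimeMs;    // generation-to-disappearance duration
};

// A candidate particle is valid at the first time if the first time,
// converted to an offset from the system start, falls inside its life cycle.
bool validAtFirstTime(const LifeCycle& lc, double firstTimeMs, double systemStartMs) {
    double offset = firstTimeMs - systemStartMs;  // converted first time
    return offset >= lc.generationTimeMs &&
           offset - lc.generationTimeMs <= lc.survivalTimeMs;
}
```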
After the validity of the candidate particles at the first time is determined, the candidate particles valid at the first time are taken as the target particles; in this case, satisfying the selection condition at the first time means being valid at the first time. With this way of determining the target particles there is no need to check whether the frame is the initial frame, which reduces judgment logic and makes determining the target particles more convenient.
In another possible implementation, the target particles are determined as follows: judge whether the frame corresponding to the first time is the initial frame of the particle effect; if it is, determine the target particles according to mode one below; if it is not, determine them according to mode two below. The initial frame is the first frame among the frames corresponding to the particle system, and whether a frame is the initial frame can be judged by whether any frame has been generated before it. With this way of determining the target particles there is no need to test the validity of every candidate particle at the first time, which saves computation.
Mode one: in response to the frame being the initial frame of the particle effect, take the candidate particles whose generation time matches the first time as the target particles.
When the frame is the initial frame, no candidate particles have been generated before the first time. In this case, determining the particles on which the frame is based only requires considering the candidate particles whose generation time matches the first time, i.e., taking those candidates as the target particles. A candidate particle whose generation time matches the first time is one that is to be generated at the first time.
Illustratively, a candidate particle whose generation time matches the first time is one whose generation time equals the converted time corresponding to the first time, or equivalently one whose generation time's converted time equals the first time. The converted time corresponding to the first time is the first time expressed on the same scale as the generation time, and the converted time corresponding to a generation time is that generation time expressed on the same scale as the first time. Illustratively, the generation time is relative to the particle system's start time while the first time is a time on the computer device; the converted time for the first time is then the first time re-expressed relative to the start time, and the converted time for a generation time is the device time obtained from the generation time and the start time.
For example, if the particle system starts at 10:00:00, a candidate particle's generation time is 10 s, and the first time is 10:00:10, then the converted time corresponding to the first time is 10 s and the converted time corresponding to the generation time is 10:00:10; the generation time matches the first time because the converted time of the generation time equals the first time (equivalently, because the generation time equals the converted time of the first time).
Under mode one, when the frame is the initial frame, the target particles are the candidate particles whose generation time matches the first time; in this case, satisfying the selection condition at the first time means the generation time matches the first time.
Mode two: in response to the frame not being the initial frame of the particle effect, take as the target particles both the candidate particles whose generated duration at the first time is less than their survival duration and the candidate particles whose generation time matches the first time.
When the frame is not the initial frame, frames precede it, and the particles on which those earlier frames were based are all candidate particles generated before the first time. Each such candidate particle has a generated duration at the first time; if that generated duration is less than the particle's survival duration, the particle continues to participate in generating the frame corresponding to the first time. Therefore, when the frame is not the initial frame, determining the particles on which it is based requires considering not only the candidate particles whose generation time matches the first time but also the candidate particles whose generated duration at the first time is less than their survival duration. In this case, satisfying the selection condition at the first time means either that the generation time matches the first time, or that the generated duration at the first time is less than the survival duration.
For example, since the candidate particles whose generation time matches the first time are among those not yet generated before the first time, while the candidate particles whose generated duration is less than their survival duration are among those already generated before the first time, the latter can be determined from the first reference particles and the former from the second reference particles, where the first reference particles are the candidate particles already generated before the first time and the second reference particles are the candidate particles not yet generated before the first time.
Illustratively, a candidate particle not yet generated before the first time is one whose generation time is not earlier than the converted time corresponding to the first time (equivalently, whose generation time's converted time is not earlier than the first time); a candidate particle already generated before the first time is one whose generation time is earlier than the converted time corresponding to the first time (equivalently, whose generation time's converted time is earlier than the first time).
For example, all candidate particles already generated before the first time may serve as the first reference particles, or only those candidates already generated before the first time that satisfy a screening condition.
For example, the candidate particles already generated before the first time that satisfy the screening condition may be the particles on which the frame immediately preceding the current frame was based. Because the effect is generated frame by frame, and because, while generating the later of two adjacent frames, the particles of the earlier frame that are invalid at the later frame's generation time (i.e., whose generated duration at that time is not less than their survival duration) are removed, any particle generated before the first time that was not among the preceding frame's particles is already invalid at the first time and needs no comparison between generated duration and survival duration. Taking the particles of the immediately preceding frame as the first reference particles therefore effectively reduces the number of particles whose generated duration must be compared against their survival duration, improving the efficiency of determining the target particles.
For example, after the first reference particles are determined, it can be judged whether each first reference particle's generated duration at the first time is less than its survival duration. If it is, the first reference particle is still valid at the first time and can participate in generating the frame corresponding to the first time; if it is not, the first reference particle is invalid at the first time and does not participate.
Illustratively, the survival duration of a first reference particle is either recorded directly in its life cycle or calculated from the time interval representing its life cycle. The image processor may record the actual time at which the first reference particle was generated, on the same scale as the first time; in that case the time difference between the first time and that recorded time is the particle's generated duration at the first time. In another exemplary embodiment, the image processor records the generation time of the first reference particle relative to the particle system's start time; in that case the first time is first converted to a time relative to the start time, and the time difference between the converted time and the generation time is the generated duration at the first time.
By comparing each first reference particle's generated duration at the first time with its survival duration, the candidate particles whose generated duration is less than their survival duration can be determined from the first reference particles. Determining, from the second reference particles, the candidate particles whose generation time matches the first time follows the same principle as in mode one and is not repeated here.
It should be noted that the embodiments of this application are described for the case where, when the frame is not the initial frame, there exist both candidate particles whose generation time matches the first time and candidate particles whose generated duration at the first time is less than their survival duration. In some embodiments only one kind may exist: if only candidate particles whose generation time matches the first time exist, the target particles are those candidates and satisfying the selection condition at the first time means the generation time matches the first time; if only candidate particles whose generated duration is less than their survival duration exist, the target particles are those candidates and satisfying the selection condition at the first time means the generated duration at the first time is less than the survival duration. A combined sketch of modes one and two is given below.
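The sketch mentioned above combines modes one and two under the same millisecond-offset assumptions as the earlier examples; exact equality is usable here only because generation times and frame times sit on the same sampling grid.

```cpp
#include <vector>

struct Candidate {
    double generationTimeMs;  // offset from the particle system's start
    double survivalTimeMs;
};

// Select the target particles for the frame whose (converted) generation
// time is firstTimeOffsetMs. Mode one applies to the initial frame; mode
// two additionally keeps already-generated particles that are still alive.
std::vector<Candidate> selectTargetParticles(const std::vector<Candidate>& candidates,
                                             double firstTimeOffsetMs,
                                             bool initialFrame) {
    std::vector<Candidate> targets;
    for (const Candidate& c : candidates) {
        // Generation time matches the first time (modes one and two).
        bool generatedNow = c.generationTimeMs == firstTimeOffsetMs;
        // Generated earlier, and the generated duration at the first time
        // is still below the survival duration (mode two only).
        bool stillAlive = !initialFrame &&
                          c.generationTimeMs < firstTimeOffsetMs &&
                          firstTimeOffsetMs - c.generationTimeMs < c.survivalTimeMs;
        if (generatedNow || stillAlive) {
            targets.push_back(c);
        }
    }
    return targets;
}
```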
After the target particles are determined, their reference data can be extracted from the reference data of the candidate particles, and step 202 is then performed.
In step 202, rendering data corresponding to the target particles at the first time is obtained based on the reference data of the target particles, and the frame of the particle effect is generated based on the rendering data corresponding to the target particles at the first time.
The rendering data corresponding to the target particles at the first time is the data on which the frame corresponding to the first time (i.e., the frame discussed in the embodiments of this application) is generated. There are one or more target particles, each with its own rendering data at the first time, and the rendering data of different target particles includes the same types of data.
For any target particle, its rendering data at the first time constrains how the particle appears in the frame corresponding to the first time. That rendering data may include several types of data, each constraining one attribute of the particle's appearance in the frame. Which types of data are included is set empirically or adjusted for the application scenario, which the embodiments of this application do not limit. Illustratively, the several types of data may also be called data of several attributes.
In an exemplary embodiment, the rendering data corresponding to a target particle at the first time includes at least one of the size data, rotation data, speed data, position data, color data, and map data corresponding to the target particle at the first time, described in turn below (a layout sketch follows the descriptions).
The size data corresponding to the first time constrains the size (Size) of the target particle in the frame corresponding to the first time; from it, the image processor knows the particle's shape and dimensions in the frame. By way of example, the size data may include shape data and dimension data, e.g., a circle with a radius of 0.1 cm, or a square with a side length of 0.1 cm.
The rotation data corresponding to the first time constrains the rotation (Rotation) of the target particle in the frame corresponding to the first time; from it, the image processor knows the particle's rotation state in the frame. For example, the rotation data may include rotational angular velocities along the reference directions of three-dimensional space, or along the reference directions of two-dimensional space.
The speed data corresponding to the first time constrains the speed (Speed) of the target particle in the frame corresponding to the first time; from it, the image processor knows how fast the particle is moving in the frame. For example, the speed data may include velocities along the reference directions of three-dimensional space, or along the reference directions of two-dimensional space.
The position data corresponding to the first time constrains the position (Position) of the target particle in the frame corresponding to the first time; from it, the image processor knows where the particle is located in the frame. For example, the position data may include three-dimensional or two-dimensional coordinates.
The color data corresponding to the first time constrains the color (Color) of the target particle in the frame corresponding to the first time; from it, the image processor knows the particle's color in the frame, such as red or yellow.
The map data corresponding to the first time constrains the texture mapping of the target particle in the frame corresponding to the first time; from it, the image processor obtains the particle's mapping in the frame, which may be expressed with map coordinates (e.g., UV coordinates).
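The attribute descriptions above map naturally onto a per-particle record; the concrete field shapes below (3D vectors, RGBA color, a UV pair) are illustrative assumptions rather than a layout the patent specifies.

```cpp
// One possible layout for the rendering data corresponding to a target
// particle at the first time.
struct RenderingData {
    float size;         // shape/size, e.g. radius or side length in cm
    float rotation[3];  // rotational angular velocity per reference direction
    float speed[3];     // velocity per reference direction
    float position[3];  // coordinates within the frame
    float color[4];     // RGBA color
    float uv[2];        // map (texture) coordinates
};
```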
In an exemplary embodiment, the implementation manner of obtaining the rendering data corresponding to the target particle at the first time is related to the case of data included in the reference data of the target particle based on the reference data of the target particle, and the implementation manner of obtaining the rendering data corresponding to the target particle at the first time is different based on the reference data of the target particle when the case of data included in the reference data of the target particle is different.
In one possible implementation, the reference data of the target particle includes initial rendering data corresponding to the target particle at the generation time of the target particle. In this case, obtaining the rendering data corresponding to the target particle at the first time based on the reference data of the target particle includes: based on the initial rendering data, calculating the rendering data corresponding to the target particle at the first time using rendering data calculation logic, where the rendering data calculation logic indicates the association relationship between the rendering data corresponding to the target particle at the first time and the initial rendering data. In this implementation, the image processor only needs to store the rendering data corresponding to the target particle at its generation time, and does not need to store rendering data corresponding to other times in the life cycle after generation, which saves storage resources.
Since the rendering data calculation logic indicates the association relationship between the rendering data corresponding to the target particle at the first time and the initial rendering data, after the initial rendering data of the target particle is extracted from the reference data of the target particle, the rendering data corresponding to the target particle at the first time can be calculated according to the association relationship indicated by the rendering data calculation logic. The embodiment of the present application does not limit the representation form of the rendering data calculation logic, as long as it can indicate this association relationship; the rendering data calculation logic may be represented by a curve, a formula, a computer program, or the like.
In an exemplary embodiment, when the rendering data corresponding to the target particle at the first time includes multiple types of data, the rendering data calculation logic may include calculation logic corresponding to each of those types, and each type of data corresponding to the target particle at the first time is calculated according to the calculation logic for that type. For example, in a case where the rendering data corresponding to the target particle at the first time includes at least one of size data, rotation data, speed data, position data, color data, and map data, the rendering data calculation logic correspondingly includes at least one of size calculation logic, rotation calculation logic, speed calculation logic, position calculation logic, color calculation logic, and map calculation logic.
For ease of understanding, take any one type of data in the initial rendering data corresponding to the target particle as an example, and calculate the data of that type in the rendering data corresponding to the first time using the calculation logic for that type. The calculation logic corresponding to any type of data can be approximated as y = a·t + b, where y represents the data of that type in the rendering data corresponding to the particle at any time; a represents a transformation factor, which may be a constant or a curve calculation formula corresponding to that type of data; b represents the data of that type in the initial rendering data corresponding to the particle; and t represents how long the particle has existed at that time. Here a is known data, while t, b, and y are unknown. The value of t is calculated as the difference between the first time and the generation time of the target particle, the data of that type in the initial rendering data corresponding to the target particle is taken as the value of b, and substituting the values of t and b into y = a·t + b (a being known) yields the value of y, which is the data of that type in the rendering data corresponding to the target particle at the first time.
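Illustratively, a minimal numeric sketch of this y = a·t + b logic is shown below (in Python; the function name and the choice of a constant transformation factor a are assumptions for illustration, since a may also be a curve):

```python
def compute_rendering_value(a: float, b: float,
                            first_time: float, generation_time: float) -> float:
    # Evaluate y = a * t + b for one type of rendering data:
    #   a: transformation factor (taken as a constant here; the embodiment
    #      also allows a curve evaluated at t)
    #   b: the value of this data type in the initial rendering data
    #   t: how long the particle has existed at the first time
    t = first_time - generation_time
    return a * t + b

# Example: a particle generated at 2.0 s whose size grows by 0.5 per second
# from an initial size of 1.0, evaluated at the frame time 3.5 s:
size_at_first_time = compute_rendering_value(0.5, 1.0, 3.5, 2.0)  # 1.75
```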
For example, different target particles may correspond to the same rendering data calculation logic or to different rendering data calculation logic, which is not limited in the embodiments of the present application. Having different target particles share the same rendering data calculation logic helps save storage resources, while giving different target particles different rendering data calculation logic improves the flexibility of rendering data calculation.
For example, the rendering data calculation logic may be pre-stored in the image processor, the rendering data calculation logic being obtained by the central processor in the computer device transforming a computing resource and then sent to the image processor. Illustratively, the computing resource is built in an offline build stage.
Illustratively, the computing resource is constructed based on the rendering data calculation logic: in the offline build stage, the rendering data calculation logic is transformed into a form convenient for loading by the central processor of the computer device, and the result of this transformation is the computing resource. Because the rendering data calculation logic is what is actually used to obtain the rendering data, after loading the computing resource the central processor needs to transform it back into a form convenient to use, and this inverse transformation yields the rendering data calculation logic. Illustratively, the process of transforming the computing resource and the process of building the computing resource are inverse processes; building the computing resource may also be referred to as serializing the rendering data calculation logic, and transforming the computing resource may also be referred to as deserializing the computing resource.
The computing resource and the particle data resource described above are two different resources. For example, the computing resource and the particle data resource may be recorded in different buffers to facilitate loading by the central processor. The data recording formats of the different buffers may be the same or different. For example, the particle data resource is recorded in a first buffer whose data recording format is attribute (one data format), and the computing resource is recorded in a second buffer whose data recording format is uniform (another data format).
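Illustratively, the serialization and deserialization described above can be sketched as follows (in Python, with JSON as a stand-in serialization format; the actual build format of the computing resource is not specified by this embodiment):

```python
import json

def build_computing_resource(calc_logic_params: dict) -> bytes:
    # Offline build stage: serialize the rendering data calculation logic
    # (represented here by its parameters) into a loadable computing resource.
    return json.dumps(calc_logic_params).encode("utf-8")

def load_computing_resource(resource: bytes) -> dict:
    # Runtime: the central processor deserializes the computing resource
    # back into a usable form before sending it to the image processor.
    return json.loads(resource.decode("utf-8"))

# Building and loading are inverse (serialize/deserialize) processes:
params = {"size": {"a": 0.5}, "color": {"a": -0.1}}
assert load_computing_resource(build_computing_resource(params)) == params
```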
In another possible implementation, the reference data of the target particle includes rendering data corresponding to the target particle at each candidate time in the life cycle of the target particle. In this case, obtaining the rendering data corresponding to the target particle at the first time based on the reference data of the target particle includes: determining, among the candidate times, a target candidate time that matches the first time, and taking the rendering data corresponding to the target particle at the target candidate time as the rendering data corresponding to the target particle at the first time. In this manner of obtaining the rendering data corresponding to the target particle at the first time, no calculation with the rendering data calculation logic is required, so the acquisition efficiency is high. Illustratively, this manner can also effectively guarantee the accuracy of the obtained rendering data in cases where the rendering data corresponding to the generation time of one frame of particle special effects depends on the rendering data corresponding to the generation time of the previous frame.
Each candidate time is a time relative to the start running time of the particle system, within the life cycle of the target particle, the life cycle being the period from the generation of the target particle to its disappearance. The candidate times are obtained by sampling at the special effect update frequency, so the time difference between every two adjacent candidate times equals the time difference between the generation times of two adjacent frames of particle special effects. That is, each candidate time corresponds to the generation time of one frame of particle special effects. Illustratively, a candidate time corresponding to the generation time of one frame of particle special effects means that the candidate time equals that generation time expressed relative to the start running time of the particle system.
Illustratively, the target candidate time that matches the first time may be determined among the candidate times as follows: calculate the time of the first time relative to the start running time of the particle system, and take the candidate time that equals the calculated time as the target candidate time. The time of the first time relative to the start running time of the particle system is the difference between the first time and the start running time of the particle system.
Illustratively, the target candidate time that matches the first time may also be determined as follows: take the sum of each candidate time and the start running time of the particle system as the converted time corresponding to that candidate time, and take the candidate time whose converted time equals the first time as the target candidate time.
Since the target candidate time is one of a plurality of candidate times and the reference data of the target particles includes rendering data corresponding to each candidate time of the target particles, after the target candidate time is determined, the rendering data corresponding to the target particles at the target candidate time can be extracted from the reference data of the target particles, and the rendering data corresponding to the target particles at the target candidate time is used as the rendering data corresponding to the target particles at the first time.
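Illustratively, the candidate-time matching can be sketched as follows (in Python, following the first variant described above; the function name and the tolerance are assumptions):

```python
def match_candidate_time(first_time: float, system_start_time: float,
                         candidate_times: list, eps: float = 1e-6):
    # Candidate times are stored relative to the particle system's start
    # running time, sampled at the special effect update frequency.
    relative = first_time - system_start_time
    for t in candidate_times:
        if abs(t - relative) < eps:
            return t  # the target candidate time
    return None       # no candidate time matches the first time

# With an update interval of 1/30 s and a particle alive from 0.0 s to 0.1 s
# (relative), the candidate times would be [0.0, 1/30, 2/30, 3/30]:
times = [i / 30 for i in range(4)]
target = match_candidate_time(10.0 + 2 / 30, 10.0, times)  # 2/30
```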
After the rendering data corresponding to the target particles at the first time is obtained, generating a particle special effect of any frame based on the rendering data corresponding to the target particles at the first time. The special effect of any frame of particles is a special effect of one frame of particles with the generation time being the first time.
In an exemplary embodiment, the process of generating any frame of particle special effects based on the rendering data corresponding to the target particles at the first time may be: filling, in a blank image, the pixel values of the pixel points corresponding to the target particles according to the rendering data corresponding to the target particles at the first time; then filling the pixel values of the unfilled pixel points by interpolation based on the pixel values of the filled pixel points, and taking the fully filled image as the frame of particle special effects. Exemplary interpolation methods include, but are not limited to, linear interpolation, bilinear interpolation, and the like.
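Illustratively, the interpolation-based filling step can be sketched in simplified one-dimensional form as follows (in Python; a real implementation would interpolate over two dimensions, e.g., bilinearly):

```python
def fill_row_by_linear_interpolation(row: list) -> list:
    # Fill None pixels lying between two filled pixels in one image row by
    # linear interpolation; filled pixels come from particle rendering data.
    filled = [i for i, v in enumerate(row) if v is not None]
    for left, right in zip(filled, filled[1:]):
        for i in range(left + 1, right):
            w = (i - left) / (right - left)
            row[i] = (1 - w) * row[left] + w * row[right]
    return row

print(fill_row_by_linear_interpolation([0.0, None, None, None, 1.0]))
# -> [0.0, 0.25, 0.5, 0.75, 1.0]
```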
Illustratively, the generation of particle special effects may be implemented by a shader (Shader) in the image processor. A shader can be considered a program written in a shading language that runs on the image processor and tells the graphics pipeline how to compute and output images. Illustratively, shaders may include vertex shaders (Vertex Shader) and fragment shaders (Fragment Shader). A vertex shader is a program for processing vertex data; in the embodiment of the present application, the process of obtaining the rendering data corresponding to the target particles at the first time may be implemented by calling the vertex shader. A fragment shader is used to calculate and fill the color of each pixel; the process of generating any frame of particle special effects based on the rendering data corresponding to the target particles at the first time may be implemented by calling the fragment shader.
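Illustratively, the division of labor between the two shader stages can be sketched in host code as follows (in Python as a schematic stand-in, not actual shader code; all names are hypothetical):

```python
def vertex_stage(particles, calc_logic, first_time):
    # Vertex shader role: derive each target particle's rendering data
    # at the first time from its reference data.
    return [calc_logic(p, first_time) for p in particles]

def fragment_stage(rendered, width, height):
    # Fragment shader role: compute and fill pixel colors into a blank image.
    image = [[(0, 0, 0)] * width for _ in range(height)]
    for r in rendered:
        x, y = r["pixel"]
        image[y][x] = r["color"]
    return image

particles = [{"pixel": (1, 1), "base": 100}]
logic = lambda p, t: {"pixel": p["pixel"], "color": (p["base"] + int(10 * t),) * 3}
frame = fragment_stage(vertex_stage(particles, logic, 0.5), width=3, height=3)
```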
According to the method provided by the embodiment of the present application, the resource construction capability provided by the development tool can be used so that, when the particle system resources are built, static data of the particle system is preprocessed into data that the GPU can use directly at game runtime (such as the reference data of candidate particles and the rendering data calculation logic). No extra CPU calculation is then needed while the particle system runs: calculation and rendering of the particle system are performed entirely on the GPU. This reduces the CPU time occupied by the particle system at runtime, improves the runtime performance of the particle system, improves the generation efficiency of particle special effects, and thereby improves the running performance of applications that generate particle special effects with the particle system as well as the rate of human-computer interaction. The method can greatly improve the running performance of the particle system on a real machine without imposing extra work on developers.
According to the particle special effect generation method, the image processor in the computer equipment stores the reference data of all particles according to which the particle special effect of each frame is generated, and the image processor can automatically acquire rendering data required by generating the particle special effect on the basis of the reference data of all the particles, so that the particle special effect is generated. The particle special effect generation process does not need the participation of a central processing unit, the image processor can automatically generate the particle special effect without waiting for the calculation of the central processing unit, the particle special effect generation efficiency is not limited by the calculation efficiency of the central processing unit, and the particle special effect generation efficiency is improved.
In an exemplary embodiment, building a particle system with a development tool includes two processes. The first process: technicians edit and produce the desired particle special effects using the development tool. The second process: technicians use the development tool to build the resources needed to realize the desired particle special effects; since these resources are the ones used when running on a computer device, they may also be referred to as the resources needed for real-machine operation.
Illustratively, editing and producing the desired particle special effects with the development tool is implemented by technicians setting corresponding data on an editing page provided by the development tool. The types of data to be set in the editing page are determined empirically or flexibly adjusted according to the actual application product of the particle special effects, which is not limited in the embodiment of the present application. By way of example, the data to be set may include the data of candidate modules in the particle system, the texture size, the texture type, and the like. The types of candidate modules whose data needs to be adjusted may also be set empirically and flexibly adjusted according to the application scene, which the embodiment of the present application does not limit. Illustratively, the types of candidate modules include at least one of a general module, a generator module, a rendering module, a generator shape module, a sub-generator module, a size module, a rotation module, a color module, a speed module, a speed limit module, and a map animation module. Combining candidate modules realizes different particle special effects.
Illustratively, a general (Common) module is used to set basic properties of the particles, such as initial velocity, initial rotation, initial color, and particle survival duration; a generator (Emitter) module is used to set properties related to particle generation, such as the generation interval, the particle system duration, and whether generation loops; a rendering (Renderer) module is used to set particle rendering properties, such as the rendering mode; a generator shape (Emitter Shape) module is used to set the type of the particle generator, which can determine the initial direction, initial position, and the like of the particles; a sub-generator (Sub Emitters) module is used to create additional sub-generators at particle locations at certain stages of the particle life cycle, which requires configuring a list of sub-generators and selecting their trigger conditions and the properties they inherit from the parent particles; a size (Size By Life) module is used to set properties related to size change over the particle life cycle, such as the change of particle size along the X, Y, and Z axes; a rotation (Rotation By Life) module is used to set properties related to rotation change over the particle life cycle, such as the angular velocity of rotation in the X, Y, and Z directions; a color (Color By Life) module is used to set properties of color change over the particle life cycle; a speed (Speed By Life) module is used to set properties related to movement speed change over the particle life cycle, such as the speed in the X, Y, and Z directions; a speed limit (Speed Limit By Life) module is used to set properties related to the movement speed limit over the particle life cycle, such as lower limit values of the speed along the X, Y, and Z axes (the speed limit module and the speed module together determine the speed of a particle); and a map animation (Texture Sheet Animation) module is used to set map animation properties, such as the animation mode, the animation type, the change of the animation frame index, the initial frame of the animation, and the number of times the animation sequence repeats over the particle life cycle.
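For example, a minimal sketch of how such module settings might be organized is shown below (in Python, with entirely hypothetical module and field names; the development tool defines the real schema):

```python
# Entirely hypothetical module and field names, mirroring the candidate
# modules described above; the development tool defines the real schema.
particle_system_config = {
    "common":        {"initial_velocity": 1.0, "initial_color": (1, 1, 1, 1),
                      "survival_duration": 2.0},
    "emitter":       {"generation_interval": 0.1, "duration": 5.0, "loop": True},
    "renderer":      {"mode": "billboard"},
    "size_by_life":  {"x": "linear", "y": "linear", "z": "constant"},
    "color_by_life": {"gradient": [(0.0, (1, 0, 0, 1)), (1.0, (1, 1, 0, 0))]},
}
```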
For example, as shown in fig. 3, each candidate module on the effect editing page provided by the development tool corresponds to an expansion control and a selection control (for example, the general module corresponds to the expansion control 301 and the selection control 302). Triggering the expansion control corresponding to a candidate module allows the properties of that module to be set, and triggering the selection control corresponding to a candidate module causes the data already set in that module to be taken into account while the particle system runs. Illustratively, in the editing page shown in fig. 3, in addition to selecting one or more candidate modules for editing, the texture size, the source address of the texture, and the like of the particles also need to be edited. Illustratively, the editing page further provides a particle player (Particle Player) control, which supports playing and stopping the particle special effects to preview them; while the particle special effects play, the editing page displays the current number of particles (Particle Count) and the playback time (Playback Time) since the particle system started. For example, when previewing the particle special effects, one may choose to play only selected particles, the selected particles being preset.
Illustratively, the effect editing page provided by the development tool offers an option of whether to use a baking mode, that is, a mode indicating that each frame of particle special effects is generated by the image processor, which can improve the running performance of the particle system on a real machine. The particle special effect generation method provided in the embodiment of the present application refers to the method used when the baking mode is enabled: if a technician selects the control corresponding to the baking mode, the baking mode needs to be started, meaning that the calculation process for generating the particle special effects is completed entirely by the image processor in the computer device.
Illustratively, after editing and producing the particle special effects is completed, the second process of building the particle system (i.e., the process in which technicians use the development tool to build the resources needed to realize the desired particle special effects) may be performed. Illustratively, the resource building process is related to the type of development tool and the actual build requirements, which the embodiments of the present application do not limit.
Illustratively, the resource building process may be carried out through a resource build page as shown in fig. 4. In the resource build page shown in fig. 4, a technician needs to select whether to build only scripts, the resources of a 3D entry scene, the resources of a 2D entry scene, the remote resource upload mode, whether to place the first scene remotely, whether to check compliance with the upload specification after the build, whether to clear the build cache, and the like. The resource build page shown in fig. 4 also displays the prompt "to debug over LAN, the tool and the real machine must be on the same WiFi", where the tool refers to the development tool, the real machine refers to the computer device running the particle system, and WiFi stands for Wireless Fidelity. The resources of the 3D entry scene and the resources of the 2D entry scene indicate, respectively, the three-dimensional scene and the two-dimensional scene on which the particle special effects are generated.
By filling in the relevant information in the resource build page shown in fig. 4 and triggering the confirmation control, the development tool can automatically build the resources required to realize the desired particle special effects (i.e., the resources required by the particle system when running on a real machine). When the baking mode is used, the resources to be built include at least the particle data resource and the computing resource: the particle data resource is used by the central processor to obtain the reference data of the candidate particles and send it to the image processor, and the computing resource is used by the central processor to obtain the rendering data calculation logic and send it to the image processor.
The particle system is an embedded program in a target application. After the resources of the particle system and the other resources of the embedded program are built, a two-dimensional code of the embedded program can be generated, and scanning this two-dimensional code with the target application installed on the computer device runs the built embedded program (a preview run, a formal run, or the like). While the embedded program runs, because the baking mode was selected on the effect editing page, the particle special effects are generated according to the method provided by the embodiment of the present application; that is, the calculation and rendering of the particle system are completed entirely in the image processor, and no additional central processor calculation is used while the particle system runs, which effectively improves runtime performance.
For example, when the mode setting condition is satisfied in the effect editing stage, that is, when the particle special effect generation mode set in the effect editing stage is the mode indicating that each frame of particle special effects is generated by the image processor (e.g., the baking mode is selected in the effect editing page shown in fig. 3), the generation process of the particle special effects may be as shown in fig. 5. In the offline build stage of the particle system, particle system data is obtained according to the particle system effects set by the developer; particle special effect resources are built according to the particle system data, and during this build the data of each calculation module, that is, the rendering data calculation logic, is assembled into the first buffer; the reference data of all particles (candidate particles) generated during the life cycle of the particle system is calculated in advance, and the reference data of the candidate particles is assembled and stored into the second buffer. Illustratively, the data in the first buffer is in uniform format and the data in the second buffer is in attribute format.
When the particle system needs to run on the computer device (i.e. the real machine running stage), the CPU in the computer device initializes the particle system, and a first buffer and a second buffer which are already built are obtained through initialization, rendering data calculation logic is sent to the vertex shader and the fragment shader in the GPU through the first buffer, and reference data of candidate particles is sent to the vertex shader in the GPU through the second buffer.
After the GPU receives the reference data of the candidate particles and the rendering data calculation logic, whenever a frame of particle special effects needs to be generated, the vertex shader is used to detect whether each candidate particle satisfies the selection condition at the generation time of that frame; particles that do not satisfy the selection condition are discarded, and for the particles that satisfy the selection condition (the target particles), the rendering data corresponding to the target particles at the first time is calculated using the rendering data calculation logic based on their reference data. The rendering data calculation logic includes size calculation logic, rotation calculation logic, speed calculation logic, position calculation logic, color calculation logic, and map calculation logic. After the rendering data corresponding to the target particles at the first time is obtained, the fragment shader is used to generate the frame of particle special effects based on that rendering data, and the generated particle special effects are displayed on the display screen.
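Illustratively, the selection check performed per candidate particle can be sketched as follows (in Python host code rather than vertex shader code; field names are hypothetical):

```python
def meets_selection_condition(particle: dict, first_time: float) -> bool:
    # A candidate particle is kept as a target particle if it has already
    # been generated at the first time and its generated duration is still
    # less than its survival duration.
    generated = first_time - particle["generation_time"]
    return 0.0 <= generated < particle["survival_duration"]

candidates = [
    {"generation_time": 0.0, "survival_duration": 1.0},  # expired at t = 1.5
    {"generation_time": 1.0, "survival_duration": 2.0},  # alive at t = 1.5
    {"generation_time": 2.0, "survival_duration": 1.0},  # not yet generated
]
targets = [p for p in candidates if meets_selection_condition(p, 1.5)]  # 1 kept
```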
It should be noted that the above embodiment describes generating particle special effects when the effect editing stage satisfies the mode setting condition, that is, when the particle special effect generation mode set in the effect editing stage is the mode indicating that each frame of particle special effects is generated by the image processor (e.g., the baking mode is selected in the effect editing page shown in fig. 3). In some embodiments, the effect editing stage may not satisfy the mode setting condition, that is, the particle special effect generation mode set in the effect editing stage is not such a mode (e.g., the baking mode is not selected in the effect editing page shown in fig. 3). In this case, the generation of each frame of particle special effects can be realized through interaction between the central processor and the image processor: the central processor performs the operations with a smaller amount of calculation, and the image processor performs the operations with a larger amount of calculation, which guarantees the generation efficiency of the particle special effects to a certain extent.
Illustratively, when the mode setting condition is not satisfied in the effect editing stage, that is, when the particle special effect generation mode set in the effect editing stage is not the mode indicating that each frame of particle special effects is generated by the image processor (e.g., the baking mode is not selected in the effect editing page shown in fig. 3), the process of generating the particle special effects includes: the central processor determines the target particles, obtains the initial rendering data corresponding to the generation time of the target particles, and sends this initial rendering data to the image processor. The image processor receives the initial rendering data sent by the central processor; based on the initial rendering data corresponding to the generation time of the target particles, it calculates the rendering data corresponding to the target particles at the first time using the rendering data calculation logic, and based on that rendering data it generates the frame of particle special effects corresponding to the first time. The target particles are the candidate particles that satisfy the selection condition at the first time.
For example, when the mode setting condition is not satisfied in the effect editing stage (e.g., the baking mode is not selected in the effect editing page shown in fig. 3), the generation process of the particle special effects may be as shown in fig. 6. The CPU in the computer device initializes the particle system, parses the data of each calculation module, assembles the data of each calculation module into the first buffer, and sends the data of each calculation module (namely, the rendering data calculation logic) to the vertex shader and the fragment shader in the GPU through the first buffer. The data in the first buffer is in uniform format.
Whenever a frame of particle special effects needs to be generated (the generation time of that frame being the first time), the CPU detects whether new particles currently need to be generated. If no new particles need to be generated, the CPU jumps directly to the step of removing from the particle set the particles that have reached their survival duration at the first time. If new particles do need to be generated, the newly generated particles are initialized, their initial rendering data corresponding to their generation time (including size, speed, rotation, position, etc.) is obtained, the newly generated particles are added to the particle set, and then the step of removing from the set the particles that have reached their survival duration at the first time is executed.
After the step of removing from the set the particles that have reached their survival duration at the first time is executed, the particles retained in the set are taken as the target particles, the initial rendering data corresponding to the target particles at their generation time is assembled into the second buffer and sent through the second buffer to the vertex shader in the GPU, and the process returns to the step of detecting whether new particles currently need to be generated so as to enter the next cycle, the next cycle being used to generate the next frame of particle special effects. Illustratively, the data in the second buffer is in attribute format.
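Illustratively, the CPU-side loop described above can be sketched as follows (in Python; field names and the spawner interface are hypothetical):

```python
def cpu_update(particle_set: list, first_time: float, spawner) -> list:
    # One CPU-side cycle: spawn new particles if needed, remove particles
    # that have reached their survival duration, then assemble the initial
    # rendering data of the retained (target) particles for the GPU.
    particle_set.extend(spawner(first_time))
    particle_set[:] = [p for p in particle_set
                       if first_time - p["generation_time"] < p["survival_duration"]]
    return [p["initial_rendering_data"] for p in particle_set]  # second buffer

spawn = lambda t: ([{"generation_time": t, "survival_duration": 1.0,
                     "initial_rendering_data": {"position": (0, 0, 0)}}]
                   if t < 0.2 else [])
second_buffer = cpu_update([], 0.0, spawn)  # sent to the GPU vertex shader
```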
After the GPU receives the initial rendering data corresponding to the generation time of the target particles and the rendering data calculation logic, it calculates the rendering data corresponding to the target particles at the first time using the rendering data calculation logic, based on the initial rendering data. The rendering data calculation logic includes size calculation logic, rotation calculation logic, speed calculation logic, position calculation logic, color calculation logic, and map calculation logic. After the rendering data corresponding to the target particles at the first time is obtained, the fragment shader is used to generate the frame of particle special effects based on that rendering data, and the generated particle special effects are displayed on the display screen.
In the particle special effect generation process shown in fig. 6, the calculation of rendering data such as particle size, rotation, and speed is offloaded to the GPU, and rendering can be performed directly in the GPU once the calculation is complete. Compared with the related art in which rendering data is calculated in the CPU, the raw computation efficiency of the GPU is far higher than that of the CPU, so offloading the rendering data calculation to the GPU can improve the generation efficiency of particle special effects to a certain extent and thus improve the running efficiency of the particle system.
Referring to fig. 7, an embodiment of the present application provides a device for generating a particle special effect, where the device includes:
the extracting unit 701 is configured to extract, from reference data of candidate particles, reference data of target particles, where the candidate particles are all particles on which special effects of each frame of particles are generated, the target particles are candidate particles that meet a selection condition at a first time, and the first time is a generation time of special effects of any frame of particles;
an obtaining unit 702, configured to obtain rendering data corresponding to the target particle at the first time based on the reference data of the target particle;
the generating unit 703 is configured to generate any frame of particle special effects based on rendering data corresponding to the target particle at the first time.
In one possible implementation, the reference data of the target particle includes initial rendering data corresponding to the generation time of the target particle by the target particle; the obtaining unit 702 is configured to calculate, based on the initial rendering data, rendering data corresponding to the target particle at the first time using rendering data calculation logic, where the rendering data calculation logic is configured to indicate an association relationship between the rendering data corresponding to the target particle at the first time and the initial rendering data.
In one possible implementation, the reference data of the target particle includes rendering data corresponding to each candidate time of the target particle in the life cycle of the target particle; an obtaining unit 702, configured to determine a target candidate time matched with the first time from the candidate times; and taking the rendering data corresponding to the target particles at the target candidate time as the rendering data corresponding to the target particles at the first time.
In one possible implementation, any frame particle effect is an initial frame particle effect, and the target particle is a candidate particle whose generation time matches the first time.
In one possible implementation, any frame particle effect is not an initial frame particle effect, and the target particles include candidate particles whose generated duration at the first time is less than their survival duration, as well as candidate particles whose generation time matches the first time.
In one possible implementation, the apparatus further includes:
a determining unit, configured to determine validity of the candidate particle at a first time according to a life cycle of the candidate particle; candidate particles that are valid at the first time are taken as target particles.
In one possible implementation, the rendering data corresponding to the target particle at the first time includes at least one of size data, rotation data, speed data, position data, color data, and map data corresponding to the target particle at the first time.
In one possible implementation, the apparatus further includes:
the receiving unit is used for receiving the reference data of the candidate particles sent by the central processing unit in the computer equipment, the reference data of the candidate particles are obtained by transforming the particle data resources by the central processing unit, and the particle data resources are obtained by constructing in an off-line construction stage;
And a storage unit for storing the reference data of the candidate particles.
In one possible implementation manner, the offline construction stage comprises an effect editing stage and a resource construction stage, and when the effect editing stage meets the mode setting condition, the particle data resource is constructed in the resource construction stage; wherein satisfying the mode setting condition includes that the particle effect generation mode set in the effect editing stage is a mode for instructing generation of particle effects of each frame by the image processor.
According to the particle special effect generation device provided by the embodiment of the application, the image processor in the computer equipment stores the reference data of all particles according to which the particle special effect of each frame is generated, and the image processor can automatically acquire rendering data required by generating the particle special effect on the basis of the reference data of all the particles, so that the particle special effect is generated. The particle special effect generation process does not need the participation of a central processing unit, the image processor can automatically generate the particle special effect without waiting for the calculation of the central processing unit, the particle special effect generation efficiency is not limited by the calculation efficiency of the central processing unit, and the particle special effect generation efficiency is improved.
It should be noted that, when the apparatus provided in the foregoing embodiment performs the functions thereof, only the division of the functional units is used as an example, and in practical application, the functional allocation may be performed by different functional units according to needs, that is, the internal structure of the device is divided into different functional units, so as to perform all or part of the functions described above. In addition, the apparatus and the method embodiments provided in the foregoing embodiments belong to the same concept, and specific implementation processes of the apparatus and the method embodiments are detailed in the method embodiments and are not repeated herein.
In an exemplary embodiment, a computer device is also provided, the computer device comprising a processor and a memory, the memory having at least one computer program stored therein. The at least one computer program is loaded and executed by one or more processors to cause the computer device to implement the above method for generating particle special effects. The processor includes a central processor and an image processor, and the computer device performing the method means that the image processor in the computer device performs the method for generating particle special effects.
Fig. 8 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device may be a terminal, for example: PC, cell phone, smart phone, PDA, wearable device, PPC, tablet computer, smart car machine, intelligent voice interaction device, intelligent household appliances (e.g. smart TV, smart speaker, etc.), aircraft, vehicle-mounted terminal, VR device, AR device. Terminals may also be referred to by other names as user equipment, portable terminals, laptop terminals, desktop terminals, etc.
Generally, the terminal includes: a processor 1501 and a memory 1502.
The processor 1501 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1501 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). The processor 1501 may also include a main processor and a coprocessor; the main processor is a processor for processing data in the awake state, also called a CPU, and the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 1501 may be integrated with a GPU, which is responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 1501 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1502 may include one or more computer-readable storage media, which may be non-transitory. Memory 1502 may also include high-speed random access memory as well as non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1502 is configured to store at least one instruction for execution by the processor 1501 to cause the terminal to implement the method for generating particle special effects provided by the method embodiments in the present application; the terminal implementing this method means that the image processor in the terminal implements it.
In some embodiments, the terminal may further optionally include: a peripheral interface 1503 and at least one peripheral device. The processor 1501, memory 1502 and peripheral interface 1503 may be connected by a bus or signal lines. The individual peripheral devices may be connected to the peripheral device interface 1503 via a bus, signal lines, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1504, a display 1505, a camera assembly 1506, audio circuitry 1507, and a power supply 1508.
A peripheral interface 1503 may be used to connect I/O (Input/Output) related at least one peripheral device to the processor 1501 and the memory 1502. In some embodiments, processor 1501, memory 1502, and peripheral interface 1503 are integrated on the same chip or circuit board; in some other embodiments, either or both of the processor 1501, the memory 1502, and the peripheral interface 1503 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The Radio Frequency circuit 1504 is configured to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 1504 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 1504 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1504 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, etc. The radio frequency circuit 1504 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: metropolitan area networks, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi networks. In some embodiments, the radio frequency circuit 1504 may also include NFC (Near Field Communication, short range wireless communication) related circuits, which are not limited in this application.
Display 1505 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When display screen 1505 is a touch display screen, display screen 1505 also has the ability to collect touch signals at or above the surface of display screen 1505. The touch signal may be input to the processor 1501 as a control signal for processing. At this point, display 1505 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 1505 may be one, disposed on the front panel of the terminal; in other embodiments, the display 1505 may be at least two, respectively disposed on different surfaces of the terminal or in a folded design; in other embodiments, the display 1505 may be a flexible display disposed on a curved surface or a folded surface of the terminal. Even more, the display 1505 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The display screen 1505 may be made of LCD (Liquid Crystal Display ), OLED (Organic Light-Emitting Diode) or other materials.
The camera assembly 1506 is used to capture images or video. Optionally, the camera assembly 1506 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being one of a main camera, a depth camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fused shooting functions. In some embodiments, the camera assembly 1506 may also include a flash. The flash may be a single color temperature flash or a dual color temperature flash. A dual color temperature flash is a combination of a warm light flash and a cold light flash and can be used for light compensation under different color temperatures.
The audio circuitry 1507 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and the environment, converting the sound waves into electric signals, inputting the electric signals to the processor 1501 for processing, or inputting the electric signals to the radio frequency circuit 1504 for voice communication. For the purpose of stereo acquisition or noise reduction, a plurality of microphones can be respectively arranged at different parts of the terminal. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is used to convert electrical signals from the processor 1501 or the radio frequency circuit 1504 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, the audio circuit 1507 may also include a headphone jack.
The power supply 1508 is used to power the various components in the terminal. The power source 1508 may be alternating current, direct current, disposable battery, or rechargeable battery. When the power source 1508 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal further includes one or more sensors 1509. The one or more sensors 1509 include, but are not limited to: an acceleration sensor 1510, a gyro sensor 1511, a pressure sensor 1512, an optical sensor 1513, and a proximity sensor 1514.
The acceleration sensor 1510 may detect the magnitudes of accelerations on three coordinate axes of a coordinate system established with a terminal. For example, the acceleration sensor 1510 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 1501 may control the display screen 1505 to display the user interface in either a landscape view or a portrait view based on the gravitational acceleration signal collected by the acceleration sensor 1510. The acceleration sensor 1510 may also be used for acquisition of motion data of a game or user.
The gyro sensor 1511 may detect the body direction and rotation angle of the terminal, and may cooperate with the acceleration sensor 1510 to collect the user's 3D motion on the terminal. Based on the data collected by the gyro sensor 1511, the processor 1501 may implement the following functions: motion sensing (for example, changing the UI according to a tilting operation by the user), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1512 may be disposed on a side frame of the terminal and/or below the display 1505. When the pressure sensor 1512 is disposed on a side frame of the terminal, a grip signal of the terminal by the user may be detected, and the processor 1501 performs a left-right hand recognition or a quick operation according to the grip signal collected by the pressure sensor 1512. When the pressure sensor 1512 is disposed at the lower layer of the display screen 1505, the processor 1501 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 1505. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The optical sensor 1513 is used to collect the ambient light intensity. In one embodiment, processor 1501 may control the display brightness of display screen 1505 based on the intensity of ambient light collected by optical sensor 1513. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1505 is turned up; when the ambient light intensity is low, the display luminance of the display screen 1505 is turned down. In another embodiment, the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 based on the ambient light intensity collected by the optical sensor 1513.
A proximity sensor 1514, also referred to as a distance sensor, is typically provided on the front panel of the terminal. The proximity sensor 1514 is used to collect the distance between the user and the front face of the terminal. In one embodiment, when the proximity sensor 1514 detects a gradual decrease in the distance between the user and the front face of the terminal, the processor 1501 controls the display 1505 to switch from the on-screen state to the off-screen state; when the proximity sensor 1514 detects that the distance between the user and the front face of the terminal gradually increases, the processor 1501 controls the display screen 1505 to switch from the off-screen state to the on-screen state.
Those skilled in the art will appreciate that the structure shown in fig. 8 is not limiting of the terminal and may include more or fewer components than shown, or may combine certain components, or may employ a different arrangement of components.
In an exemplary embodiment, there is also provided a computer-readable storage medium having stored therein at least one computer program loaded and executed by a processor of a computer device to cause the computer to implement the method for generating particle effects described above.
In one possible implementation, the computer readable storage medium may be a Read-Only Memory (ROM), a random-access Memory (Random Access Memory, RAM), a compact disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program product is also provided, which comprises a computer program or computer instructions loaded and executed by a processor to cause the computer to implement the method for generating particle effects described above.
It should be noted that, information (including but not limited to user equipment information, user personal information, etc.), data (including but not limited to data for analysis, stored data, presented data, etc.), and signals referred to in this application are all authorized by the user or are fully authorized by the parties, and the collection, use, and processing of relevant data is required to comply with relevant laws and regulations and standards of relevant countries and regions. For example, reference data, first time, particle special effects, etc. of candidate particles referred to in this application are all acquired with sufficient authorization.
It should be understood that references herein to "a plurality" are to two or more. "and/or", describes an association relationship of an association object, and indicates that there may be three relationships, for example, a and/or B, and may indicate: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship.
The foregoing is merely exemplary embodiments of the present application and is not intended to limit the present application. Any modification, equivalent replacement, improvement, or the like made within the principles of the present application shall fall within the protection scope of the present application.

Claims (13)

1. A method of generating particle effects, the method being performed by an image processor in a computer device, the image processor having stored therein reference data for candidate particles, the method comprising:
extracting reference data of target particles from the reference data of the candidate particles, wherein the candidate particles are all particles on which special effects of each frame of particles are generated, the target particles are candidate particles meeting selection conditions at first time, and the first time is the generation time of special effects of any frame of particles;
based on the reference data of the target particles, rendering data corresponding to the target particles at the first time is obtained; and generating the particle special effect of any frame based on the rendering data corresponding to the target particle at the first time.
2. The method of claim 1, wherein the reference data of the target particle comprises initial rendering data corresponding to a generation time of the target particle by the target particle; the obtaining, based on the reference data of the target particle, rendering data corresponding to the target particle at the first time includes:
Based on the initial rendering data, rendering data corresponding to the target particles at the first time are calculated by using rendering data calculation logic, and the rendering data calculation logic is used for indicating the association relation between the rendering data corresponding to the target particles at the first time and the initial rendering data.
3. The method of claim 1, wherein the reference data of the target particle comprises rendering data for each candidate time of the target particle in a life cycle of the target particle; the obtaining, based on the reference data of the target particle, rendering data corresponding to the target particle at the first time includes:
determining, from among the candidate times, a target candidate time that matches the first time; and
taking the rendering data corresponding to the target particle at the target candidate time as the rendering data corresponding to the target particle at the first time.
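The per-candidate-time variant of claim 3 can be pictured as a baked lookup table. The following hypothetical sketch matches a particle's elapsed age to the nearest candidate time and reuses its sample; the track layout and sampled quantity are assumptions.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical baked track (claim 3): one rendering-data sample per fixed
// candidate time across the particle's life cycle.
struct BakedTrack {
    float step;                     // interval between consecutive candidate times
    std::vector<float> sizeAtTime;  // e.g. the particle's size at each candidate time
};

// Match the particle's age to the nearest candidate time and reuse its sample.
float sampleSize(const BakedTrack& track, float age) {
    if (track.sizeAtTime.empty() || track.step <= 0.0f) return 0.0f;
    std::size_t idx = static_cast<std::size_t>(age / track.step + 0.5f);
    if (idx >= track.sizeAtTime.size()) idx = track.sizeAtTime.size() - 1;
    return track.sizeAtTime[idx];
}
```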
4. A method according to any one of claims 1-3, wherein the particle special effect of the any frame is an initial frame particle special effect, and the target particles are candidate particles whose generation time matches the first time.
5. A method according to any one of claims 1-3, wherein the particle special effect of the any frame is not an initial frame particle special effect, and the target particles comprise candidate particles whose generated duration at the first time is less than a corresponding life cycle and candidate particles whose generation time matches the first time.
6. A method according to any one of claims 1-3, wherein before the extracting reference data of target particles from the reference data of the candidate particles, the method further comprises:
determining validity of the candidate particles at the first time according to the life cycles of the candidate particles; and taking candidate particles that are valid at the first time as the target particles.
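One possible way to combine the selection conditions of claims 4 to 6 in a single predicate; the names and the floating-point tolerance are assumptions, not the claimed method.

```cpp
#include <cmath>

// For the initial frame, only particles spawned at the first time qualify;
// for later frames, particles still within their life cycle at the first
// time also remain valid.
bool selectForFrame(float spawnTime, float lifetime,
                    float firstTime, bool initialFrame) {
    const float eps = 1e-6f;
    bool spawnedNow = std::fabs(spawnTime - firstTime) < eps;  // claim 4
    if (initialFrame) return spawnedNow;
    float age = firstTime - spawnTime;                         // generated duration
    bool stillValid = age > 0.0f && age < lifetime;            // claims 5 and 6
    return spawnedNow || stillValid;
}
```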
7. A method according to any of claims 1-3, wherein the rendering data corresponding to the target particle at the first time comprises at least one of size data, rotation data, speed data, position data, color data, and map data corresponding to the target particle at the first time.
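For concreteness, the fields enumerated in claim 7 could be laid out as follows; the exact layout, types, and texture-indexing scheme are hypothetical.

```cpp
// A hypothetical layout of the per-particle rendering data named in claim 7.
struct ParticleRenderData {
    float size;           // size data
    float rotation;       // rotation data, in radians
    float velocity[3];    // speed data
    float position[3];    // position data
    float color[4];       // color data, RGBA
    unsigned mapIndex;    // map data, e.g. an index into a texture atlas
};
```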
8. A method according to any one of claims 1-3, wherein the method further comprises:
receiving the reference data of the candidate particles sent by a central processing unit in the computer device, wherein the reference data of the candidate particles is obtained by the central processing unit by transforming a particle data resource, and the particle data resource is constructed in an offline construction stage;
and storing the reference data of the candidate particles.
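Claim 8's division of labor might look like the following sketch, in which the central processing unit uploads the transformed reference data once into a shader storage buffer that the image processor then reads every frame. This assumes an OpenGL 4.3+ context and a loader header such as glad; the struct layout is hypothetical.

```cpp
#include <glad/glad.h>  // assumed loader exposing the OpenGL 4.3 entry points
#include <vector>

// Hypothetical per-particle reference data produced by the central processing
// unit from the offline-built particle data resource.
struct ParticleRef { float spawnTime; float lifetime; float position[3]; float velocity[3]; };

// Upload once; the buffer then lives in GPU memory and is read by the image
// processor every frame without further CPU involvement.
GLuint storeReferenceData(const std::vector<ParticleRef>& refs) {
    GLuint ssbo = 0;
    glGenBuffers(1, &ssbo);
    glBindBuffer(GL_SHADER_STORAGE_BUFFER, ssbo);
    glBufferData(GL_SHADER_STORAGE_BUFFER,
                 static_cast<GLsizeiptr>(refs.size() * sizeof(ParticleRef)),
                 refs.data(), GL_STATIC_DRAW);
    glBindBuffer(GL_SHADER_STORAGE_BUFFER, 0);
    return ssbo;
}
```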
9. The method according to claim 8, wherein the offline construction stage comprises an effect editing stage and a resource construction stage, and the particle data resource is constructed in the resource construction stage in a case that a mode setting condition is satisfied in the effect editing stage;
wherein satisfying the mode setting condition comprises that a particle special effect generation mode set in the effect editing stage is a mode for instructing the image processor to generate each frame of the particle special effect.
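A minimal sketch of the mode setting condition in claim 9, with a hypothetical enum standing in for the mode chosen in the effect editing stage.

```cpp
// Illustration only: the particle data resource is baked at build time only
// when the effect was authored for generation on the image processor.
enum class GenerationMode { Cpu, Gpu };

bool shouldBakeResource(GenerationMode modeSetInEditor) {
    return modeSetInEditor == GenerationMode::Gpu;
}
```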
10. An apparatus for generating a particle special effect, the apparatus comprising:
an extraction unit, configured to extract reference data of target particles from reference data of candidate particles, wherein the candidate particles are all particles on which generation of each frame of the particle special effect is based, the target particles are candidate particles that satisfy a selection condition at a first time, and the first time is a generation time of the particle special effect of any frame;
an acquisition unit, configured to obtain, based on the reference data of the target particles, rendering data corresponding to the target particles at the first time; and
a generation unit, configured to generate the particle special effect of the any frame based on the rendering data corresponding to the target particles at the first time.
11. A computer device, comprising a processor and a memory, wherein at least one computer program is stored in the memory, and the at least one computer program is loaded and executed by the processor to implement the method for generating a particle special effect according to any one of claims 1 to 9.
12. A computer-readable storage medium, wherein at least one computer program is stored in the computer-readable storage medium, and the at least one computer program is loaded and executed by a processor to cause a computer to implement the method for generating a particle special effect according to any one of claims 1 to 9.
13. A computer program product, comprising a computer program or computer instructions, wherein the computer program or computer instructions are loaded and executed by a processor to cause a computer to implement the method for generating a particle special effect according to any one of claims 1 to 9.
CN202210769963.2A 2022-06-30 2022-06-30 Particle special effect generation method, device, equipment and storage medium Pending CN117372590A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210769963.2A CN117372590A (en) 2022-06-30 2022-06-30 Particle special effect generation method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117372590A true CN117372590A (en) 2024-01-09

Family

ID=89398935

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210769963.2A Pending CN117372590A (en) 2022-06-30 2022-06-30 Particle special effect generation method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117372590A (en)

Similar Documents

Publication Publication Date Title
CN110147231B (en) Combined special effect generation method and device and storage medium
US11393154B2 (en) Hair rendering method, device, electronic apparatus, and storage medium
CN109091869B (en) Method and device for controlling action of virtual object, computer equipment and storage medium
CN111701238A (en) Virtual picture volume display method, device, equipment and storage medium
KR20210113333A (en) Methods, devices, devices and storage media for controlling multiple virtual characters
CN108525298A (en) Image processing method, device, storage medium and electronic equipment
CN109754454A (en) Rendering method, device, storage medium and the equipment of object model
CN111603771B (en) Animation generation method, device, equipment and medium
CN110033503B (en) Animation display method and device, computer equipment and storage medium
CN112287852B (en) Face image processing method, face image display method, face image processing device and face image display equipment
CN108694073A (en) Control method, device, equipment and the storage medium of virtual scene
US20230072762A1 (en) Method and apparatus for displaying position mark, device, and storage medium
CN110662105A (en) Animation file generation method and device and storage medium
CN111028566A (en) Live broadcast teaching method, device, terminal and storage medium
CN111437600A (en) Plot showing method, plot showing device, plot showing equipment and storage medium
CN112306332B (en) Method, device and equipment for determining selected target and storage medium
CN112612387B (en) Method, device and equipment for displaying information and storage medium
CN116672706B (en) Illumination rendering method, device, terminal and storage medium
CN113487662A (en) Picture display method and device, electronic equipment and storage medium
CN112950753A (en) Virtual plant display method, device, equipment and storage medium
CN113436346A (en) Distance measuring method and device in three-dimensional space and storage medium
CN115861577A (en) Method, device and equipment for editing posture of virtual field scene and storage medium
CN113018865B (en) Climbing line generation method and device, computer equipment and storage medium
CN117372590A (en) Particle special effect generation method, device, equipment and storage medium
CN110300275A (en) Video record, playback method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination