CN117671097A - Model rendering method and device and electronic equipment
- Publication number: CN117671097A
- Application number: CN202311448941.7A
- Authority: CN (China)
- Prior art keywords: flame, texture, model, parameter, noise
- Legal status: Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/60—3D [Three Dimensional] animation of natural phenomena, e.g. rain, snow, water or plants
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
Abstract
The disclosure provides a model rendering method and apparatus, and an electronic device. A preset target model and flame effect parameters are acquired; the target model comprises a patch model and a point light source model in a specified positional relationship; the flame effect parameters comprise a flame texture map and a noise ripple map corresponding to the patch model, and a brightness change parameter corresponding to the point light source model; the flame texture map indicates the basic shape of the flame. A flame shape parameter is determined based on the flame texture map and the noise ripple map; the flame shape parameter indicates how the flame shape changes over time. The target model is rendered based on the brightness change parameter and the flame shape parameter to obtain a target model with a flame special effect. In this manner, the target model is rendered with a flame shape parameter and a brightness change parameter that vary over time, so that the rendered target model displays a more realistic dancing, flickering flame and the flame effect is generated more efficiently.
Description
Technical Field
The disclosure relates to the technical field of animation special effects, and in particular to a model rendering method and apparatus, and an electronic device.
Background
In a virtual scene, a flame serves both as a light source and as a useful route-guidance scene prop, and it can also enrich the displayed picture. In the related art, a flame effect can be simulated by sampling a series of flame sequence frames and playing them frame by frame; alternatively, particles can be emitted by a particle system and the resulting effect converted into a flame texture to simulate the appearance of a flame. However, these approaches generate the flame effect inefficiently and involve cumbersome processes.
Disclosure of Invention
Accordingly, an object of the present disclosure is to provide a model rendering method and apparatus, and an electronic device, so as to improve model rendering efficiency.
In a first aspect, an embodiment of the present disclosure provides a model rendering method, including: acquiring a preset target model and flame effect parameters, the target model comprising a patch model and a point light source model in a specified positional relationship, the flame effect parameters comprising a flame texture map and a noise ripple map corresponding to the patch model and a brightness change parameter corresponding to the point light source model, the flame texture map indicating the basic shape of the flame, and the brightness change parameter indicating how the brightness of the point light source model changes over time; determining a flame shape parameter based on the flame texture map and the noise ripple map, the flame shape parameter indicating how the flame shape changes over time; and rendering the target model based on the brightness change parameter and the flame shape parameter to obtain a target model with a flame special effect.
In a second aspect, an embodiment of the present disclosure provides a model rendering apparatus, including: a model acquisition module for acquiring a preset target model and flame effect parameters, the target model comprising a patch model and a point light source model in a specified positional relationship, the flame effect parameters comprising a flame texture map and a noise ripple map corresponding to the patch model and a brightness change parameter corresponding to the point light source model, the flame texture map indicating the basic shape of the flame, and the brightness change parameter indicating how the brightness of the point light source model changes over time; a shape parameter determination module for determining a flame shape parameter based on the flame texture map and the noise ripple map, the flame shape parameter indicating how the flame shape changes over time; and a model rendering module for rendering the target model based on the brightness change parameter and the flame shape parameter to obtain a target model with a flame special effect.
In a third aspect, an embodiment of the present disclosure provides an electronic device including a processor and a memory, where the memory stores machine-executable instructions executable by the processor, and the processor executes the machine-executable instructions to implement the model rendering method described above.
In a fourth aspect, an embodiment of the present disclosure provides a machine-readable storage medium storing machine-executable instructions that, when invoked and executed by a processor, cause the processor to implement the model rendering method described above.
The embodiments of the disclosure bring the following beneficial effects:
According to the model rendering method and apparatus and the electronic device described above, a preset target model and flame effect parameters are acquired; the target model comprises a patch model and a point light source model in a specified positional relationship; the flame effect parameters comprise a flame texture map and a noise ripple map corresponding to the patch model, and a brightness change parameter corresponding to the point light source model; the flame texture map indicates the basic shape of the flame, and the brightness change parameter indicates how the brightness of the point light source model changes over time. A flame shape parameter, indicating how the flame shape changes over time, is determined based on the flame texture map and the noise ripple map, and the target model is rendered based on the brightness change parameter and the flame shape parameter to obtain a target model with a flame special effect. In this manner, the target model is rendered with a flame shape parameter and a brightness change parameter that vary over time, so that the rendered target model displays a more realistic dancing, flickering flame and the flame effect is generated more efficiently.
Additional features and advantages of the disclosure will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the disclosure. The objectives and other advantages of the disclosure will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The foregoing objects, features and advantages of the disclosure will be more readily apparent from the following detailed description of the preferred embodiments taken in conjunction with the accompanying drawings.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings described below show some embodiments of the present disclosure, and a person skilled in the art may obtain other drawings from them without inventive effort.
FIG. 1 is a flow chart of a model rendering method provided in an embodiment of the present disclosure;
FIG. 2 is a schematic illustration of a mask image of the overall flame shape provided in an embodiment of the present disclosure;
FIG. 3 is a schematic illustration of a mask image of the flame's outer flame provided in an embodiment of the present disclosure;
FIG. 4 is a schematic illustration of a mask image of the flame's inner flame provided in an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a target rotation parameter of a patch model according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a patch model provided by an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of a texture image of a patch model according to an embodiment of the present disclosure;
FIG. 8 is a schematic illustration of a flame effect provided by an embodiment of the present disclosure;
FIG. 9 is a schematic illustration of another flame effect provided by an embodiment of the present disclosure;
FIG. 10 is a schematic illustration of another flame effect provided by an embodiment of the present disclosure;
fig. 11 is a schematic structural diagram of a model rendering device according to an embodiment of the disclosure;
fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions of the present disclosure will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present disclosure, but not all embodiments. Based on the embodiments in this disclosure, all other embodiments that a person skilled in the art would obtain without making any inventive effort are within the scope of protection of this disclosure.
In a virtual scene, a flame serves as a light source and a good route-guidance scene prop, and can enrich and improve the picture effect. Torches, candles, bonfires and the like are indispensable elements for medieval themes and for indoor, night-time, and dungeon scene levels.
In the related art, one way to produce a flame special effect is to sample a series of flame sequence frames and play them frame by frame to simulate the flame. However, this method requires authoring a sequence sheet of the flame's evolution and combining it into one texture map, which takes a long time. Particles may also be emitted by a particle system, converting the display of the particles into a flame texture to simulate a flame effect. However, because the particle system is computationally expensive, conventionally produced special-effect flames are difficult to apply to large-scale flame scenes. In addition, special effects can be produced in third-party software and then imported into a game engine to display the flame. For personnel outside the effects pipeline, this process is cumbersome, difficult to modify, costly to learn, and hard to iterate on.
Based on the above, the embodiments of the disclosure provide a model rendering method and apparatus, and an electronic device; the technique can be applied to various model rendering processes.
In one possible implementation, the disclosed embodiments provide a model rendering method.
As shown in fig. 1, the method comprises the steps of:
step S102, acquiring a preset target model and flame effect parameters; the target model comprises a surface patch model and a point light source model which have a specified position relation; the flame effect parameters comprise flame texture mapping, noise ripple mapping corresponding to the surface patch model and brightness change parameters corresponding to the point light source model; flame texture mapping is used to indicate the basic shape of the flame; the brightness change parameter is used for indicating the change condition of the brightness of the point light source model along with time.
The target model is usually a three-dimensional model in a virtual scene, and may specifically be a combination of a patch model and a point light source model. The patch model is a three-dimensional model formed by connected graphic primitives. The transitions between primitives in the patch model are smooth, so it can form a smooth curved surface, or a plane. Since the surface of a flame can be regarded as curved, the patch model can be made curved. In order to simulate the illumination the flame casts on the ground, a point light source model can be placed in the target model in addition to the patch model. The point light source model is usually placed at the bottom of the patch model, and can be regarded as a point element in the target model.
A real-world flame changes constantly over time, and disturbances in the surrounding air usually produce effects such as flickering and swaying in the wind. In order to simulate a realistic flame, the flame effect parameters generally need to vary over time; rendering the target model with these parameters makes the same position on the model display different colors at different times, producing an overall flickering, dancing flame.
The flame effect parameters generally include a flame texture map, a noise ripple map corresponding to the patch model, and a brightness change parameter corresponding to the point light source model. The flame texture map indicates the basic shape of the flame. A flame typically has an outer flame and an inner flame, and the flame texture map may contain portions corresponding to each. The noise ripple map may be generated from a noise texture. Noise texture is a technique commonly used in graphics to generate texture effects that are natural, random, and diverse: noise is a signal with random characteristics, and a noise texture is the texture effect obtained by processing and sampling noise. Noise can be used to simulate various effects found in nature.
The brightness change parameter indicates how the brightness of the point light source model changes over time, so as to simulate the flickering of the flame's center. In general, the brightness value can be expressed as a function of time. Although the flame's center flickers erratically, the flame never goes completely dark, so the brightness change parameter is generally confined to a certain range.
Step S104, determining a flame shape parameter based on the flame texture map and the noise ripple map; the flame shape parameter indicates how the flame shape changes over time.
In order to display a more realistic flame effect, the flame shape needs to change over time. This change can be regarded as random, but the overall shape of the flame needs to be maintained. Noise texture parameters may be generated from the noise ripple map, adding time-varying noise to the basic flame shape given by the flame texture map to produce a flame shape parameter that indicates how the flame shape changes over time.
There may be one or more noise ripple maps. When several are used, different noise ripple maps have different noise scales; superposing noise of different scales makes the resulting noise texture parameter more variable, and after it is combined with the flame texture map, the resulting change in flame shape looks more natural. After the noise is superposed, a velocity parameter describing flow over time is added to the noise texture parameter so that the flame shape varies with time. The noise texture parameter can be further processed according to factors such as the current wind direction and wind strength so that the flame shape responds to them.
The noise texture parameter generally corresponds to the basic shape of the flame. Superposing the part of the flame texture map that represents the basic flame shape with the corresponding noise texture parameter offsets the texture coordinates, deforming the basic flame shape and producing the effect of a flame dancing over time.
Step S106, rendering the target model based on the brightness change parameter and the flame shape parameter to obtain a target model with the flame special effect.
When rendering the target model, the brightness change parameter and flame shape parameter corresponding to the current time are generally determined first. The point light source model is then rendered with the brightness change parameter, and the target model with the flame shape parameter, yielding a target model with a flame special effect whose appearance varies over time. In a specific implementation, a virtual camera can shoot the target model with its dancing flame, generating video frames that are played back continuously to form the flame animation.
According to the model rendering method described above, a preset target model and flame effect parameters are acquired; the target model comprises a patch model and a point light source model in a specified positional relationship; the flame effect parameters comprise a flame texture map and a noise ripple map corresponding to the patch model, and a brightness change parameter corresponding to the point light source model; the flame texture map indicates the basic shape of the flame, and the brightness change parameter indicates how the brightness of the point light source model changes over time. A flame shape parameter, indicating how the flame shape changes over time, is determined based on the flame texture map and the noise ripple map, and the target model is rendered based on the brightness change parameter and the flame shape parameter to obtain a target model with a flame special effect. In this manner, the target model is rendered with a flame shape parameter and a brightness change parameter that vary over time, so that the rendered target model displays a more realistic dancing, flickering flame and the flame effect is generated more efficiently.
The following embodiments provide a specific way to determine flame shape parameters based on flame texture mapping and noise ripple mapping.
First, a noise texture parameter is determined based on the noise ripple map; the noise texture parameter indicates how the offset values of the corresponding texture coordinates change over time. The flame shape parameter is then determined based on the noise texture parameter and the flame texture map.
The noise ripple map stores the noise pattern. The stored noise texture can be read from the map and either used directly as the first noise texture, or processed (for example, superposed and offset), with the result used as the first noise texture. The noise ripple map typically has three channel images: an R channel image, a G channel image, and a B channel image. One is called the first channel image and another the second channel image; the noise scales of the first and second channel images differ. The noise corresponding to the first and second channel images can be superposed, and the superposed noise determined as the first noise texture.
Since a flame generally undulates in a certain direction, and the undulation is continuous, the first noise texture needs to be tiled along that direction. For example, the direction of flame undulation can be predetermined and represented by a first vector. The first noise texture is then tiled along the direction corresponding to the first vector. Next, taking the length of a preset second vector as the change speed (the speed of the flame's undulation) and the texture direction of the patch model as the change direction, a relation describing how the texture coordinates of the first noise texture change over time is established. The texture direction of the patch model is usually arranged from bottom to top, following the direction of the flame's undulation.
In order to adapt the noise texture parameter to the basic flame shape, it is determined from the first noise texture constrained to the basic flame shape and the corresponding texture coordinate changes. In a specific implementation, the third channel image of the noise ripple map may be set to a first mask image corresponding to the basic shape of the flame, as shown in fig. 2; the first noise texture can then be processed with the first mask image to obtain the noise texture parameter.
Because the flame may be affected by wind, human action, and the like while it undulates, a periodic function corresponding to a preset offset parameter (related, for example, to wind or other external forces) is established; the periodic function may be generated by superposing, inverting, and otherwise combining trigonometric functions. The noise texture parameter is then sampled, based on the periodic function and in the same direction as the texture coordinates of the patch model, to obtain the sampled noise texture parameter.
In the real world the fire source sits at the bottom of the flame, so the bottom of the flame shape is more stable than the upper part. To reduce deformation at the bottom, a mask can be built for the part of the image corresponding to the bottom of the basic flame shape in the texture map that stores the noise texture parameter, so that sampling based on the periodic function, in the same direction as the patch model's texture coordinates, is applied only to the noise texture parameters outside that part. One way to build the mask is to create a fluctuation parameter opposite to the noise texture parameter's fluctuation in that part, cancelling the noise's fluctuation there.
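The computation described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: `value_noise` stands in for the authored noise channels, and the tiling factors, pan speeds, and the 0.05 offset scale are assumed values.

```python
import math

def value_noise(u: float, v: float) -> float:
    # Cheap deterministic stand-in for one grayscale channel of the noise map.
    return 0.5 + 0.5 * math.sin(12.9898 * u + 78.233 * v)

def noise_texture_param(u: float, v: float, t: float) -> float:
    # Two channels at different scales, panned upward at different speeds,
    # then superposed for a more variable disturbance.
    n_r = value_noise(4.0 * u, 4.0 * (v - 0.30 * t))  # fine noise, fast pan
    n_g = value_noise(2.0 * u, 2.0 * (v - 0.12 * t))  # coarse noise, slow pan
    superposed = n_r + n_g
    # Stand-in for the flame-shaped B-channel mask: strongest at the center.
    flame_mask = max(0.0, 1.0 - 2.0 * abs(u - 0.5))
    return superposed * flame_mask * 0.05  # small texture-coordinate offset

print(noise_texture_param(0.5, 0.5, t=1.0))
```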
A flame effect is inseparable from flame color, so a flame color texture parameter can be determined based on the flame texture map and preset flame color parameters.
The flame texture map generally has three channel images: an R channel image, a G channel image, and a B channel image. One channel image may be set as a mask image corresponding to the shape of the outer flame, as shown in fig. 3; another may be set as a mask image corresponding to the shape of the inner flame, as shown in fig. 4. The last channel image may be left unused or set as a mask image corresponding to the overall flame shape, as shown in fig. 2; this may be configured as needed and is not limited here.
The outer flame and inner flame of a flame typically differ in color, so the flame color parameters need to include an outer flame color parameter and an inner flame color parameter. The outer flame color parameter is typically red and the inner flame color may be orange; the inner flame may also include a flame core, which is typically blue.
When determining the flame color texture parameters, the inner flame color texture parameter is determined based on the mask image corresponding to the inner flame and the inner flame color parameter; that is, the rendering color of the model region displaying the inner flame is set to the inner flame color. The outer flame color texture parameter is determined based on the mask image corresponding to the outer flame, the outer flame color parameter, and a preset transparency parameter; that is, the rendering color of the model region displaying the outer flame is set to the outer flame color. Since the outer flame generally has some transparency, that region's display color can be made semi-transparent through the preset transparency parameter.
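As an illustration of this compositing, the following sketch blends inner and outer flame colors through their masks. The RGB values and the 0.5 transparency are placeholder assumptions, not values from the disclosure; the masks would come from the flame texture map's channels.

```python
def flame_color(inner_mask: float, outer_mask: float):
    inner = (1.00, 0.55, 0.10)   # assumed orange inner flame color
    outer = (0.90, 0.15, 0.05)   # assumed red outer flame color
    alpha_outer = 0.5            # preset transparency: semi-transparent outer flame
    r = inner_mask * inner[0] + outer_mask * outer[0]
    g = inner_mask * inner[1] + outer_mask * outer[1]
    b = inner_mask * inner[2] + outer_mask * outer[2]
    a = min(1.0, inner_mask + outer_mask * alpha_outer)
    return (r, g, b, a)

print(flame_color(inner_mask=0.8, outer_mask=0.3))
```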
For the point light source model, the brightness change parameter is determined as follows: a target periodic function is generated from a preset first trigonometric function and a preset second trigonometric function whose periods differ; the trigonometric functions may be sine functions, cosine functions, and so on, set as needed. Since brightness is positive, the values of the target periodic function are all positive. The flame flickers abruptly, but its brightness always stays above a certain level, so a brightness range can be preset and the function parameters of the target periodic function adjusted so that its value range matches the brightness range. The brightness change parameter is then determined from the adjusted target periodic function; specifically, a sampling time may be set, the function value of the target periodic function at that time read, and that value determined as the brightness change parameter.
Since the display position of the flame in the virtual scene is generally fixed, the target model only needs to show the flame effect on the model surface facing the virtual camera when it is photographed; the appearance of model surfaces the virtual camera cannot capture need not be considered.
When a virtual camera is set in the virtual scene, the orientation of the target model in the target virtual scene is adjusted based on the preset pose parameters of the virtual camera so that the target model faces the virtual camera; the virtual camera is used to photograph the target virtual scene.
The virtual camera is a camera set in the target virtual scene; it can be used to simulate a third-person view or the player's view. When the virtual camera photographs the target virtual scene, the target model may or may not fall within the shot. When it does, the model surface of the target model that renders the flame effect needs to be oriented toward the virtual camera.
The pose parameters generally include a position parameter and an attitude parameter. Both the position and the attitude of the virtual camera in the target virtual scene affect its shooting range. After the shooting range of the virtual camera is determined, whether the camera can capture the target model can be judged from the target model's position in the virtual scene. If so, the model surface of the target model that produces the flame display effect is turned toward the virtual camera.
Specifically, the target model may include the patch model. The virtual camera is usually placed in the target virtual scene in advance, and its setting parameters in the scene, including its pose parameters, can be configured. The pose parameters include a position parameter, which may be called the first position parameter. A second position parameter, of the patch model in the target virtual scene, is also acquired. Then, based on the first and second position parameters, a direction vector from the patch model toward the virtual camera is determined, as shown in fig. 5, where l denotes the direction vector. A target rotation parameter of the patch model is then determined from the direction vector and the patch model's current orientation; in fig. 5, α is the target rotation angle. The orientation of the patch model in the target virtual scene can then be adjusted based on the target rotation parameter.
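The following sketch illustrates this billboard computation under simple assumptions: the horizontal patch-to-camera direction is normalized, converted to an angle with atan2, and remapped to the 0-1 rotation value used by the node implementation described later in this document. A full atan2 spans (-π, π], so the wrap into [0, 1) is an added assumption.

```python
import math

def billboard_rotation(camera_xy, patch_xy) -> float:
    dx = camera_xy[0] - patch_xy[0]
    dy = camera_xy[1] - patch_xy[1]
    length = math.hypot(dx, dy) or 1.0   # guard against zero distance
    dx, dy = dx / length, dy / length    # keep direction, drop distance
    angle = math.atan2(dy, dx)           # radians in (-pi, pi]
    # Divide by 2*pi and shift by 0.25, as in the node chain; wrap to [0, 1).
    return (angle / (2 * math.pi) + 0.25) % 1.0

print(billboard_rotation(camera_xy=(10.0, 5.0), patch_xy=(0.0, 0.0)))
```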
When the virtual camera shoots the target virtual scene, its pose usually changes to capture different scene ranges, and when the virtual player whose view the camera simulates moves, the camera's position changes too. Either change alters the captured scene range. To avoid repeatedly judging whether the camera can capture the target model, the target model can simply always be kept facing the virtual camera, preventing an incomplete or missing flame effect when the camera captures it.
If the pose parameters of the virtual camera change, for example because the camera moves to another scene position or its lens rotates, the orientation of the target model in the target virtual scene can be updated based on the changed position parameters so that the reoriented target model faces the virtual camera.
The embodiment of the disclosure also provides another model rendering method. The method is implemented on the basis of the method shown in fig. 1. The method is mainly used for solving the following problems:
1. A sequence sheet of the flame's evolution must be authored and combined into one texture map, which takes a long time.
2. The particle system is computationally expensive, so conventionally produced special-effect flames are difficult to use at scale and must be used sparingly.
3. For personnel outside the effects pipeline, the process is cumbersome, difficult to modify, costly to learn, and hard to iterate on.
The method is mainly based on the following principle: noise patterns are randomly generated with Photoshop or Substance 3D Designer, translated at different speeds and added together, and combined with a simulated wind to produce periodic fluctuation; by sampling, the result perturbs a layered flame texture applied to an upright, curved patch model, achieving a flame simulation. A material function is built to control the flame's orientation so that it always follows the camera or player lens; a light function then drives random variation of the light source to produce dynamic light-and-shadow fluctuation, and everything is packaged as a blueprint asset for use.
The method mainly comprises the following steps:
1. Use Photoshop to create a flame-pattern channel map (corresponding to the "flame texture map") and a noise map (corresponding to the "noise ripple map").
The channel map consists of red, green, and blue channels, which makes later channel extraction and color modification convenient; the images carried by the three channels are shown in figs. 2, 3 and 4, respectively.
The noise map is likewise composed of red, green, and blue channels. The red (R) and green (G) channels are two black-and-white grayscale noise patterns of different scales, and the blue (B) channel is reserved for a black-and-white mask image approximating the flame texture.
Import the channel map and the noise map into the engine, modify the texture settings, and set compression to BC7 (DX11) to improve texture quality.
Then create an arc-shaped patch using modeling software such as 3ds Max, as shown in fig. 6. The texture coordinates (UV) of the patch are arranged from bottom to top, as shown in fig. 7.
2. Author controllable disturbance nodes to simulate the effect of wind-blown fluctuation.
(1) First create the upward movement of the texture.
Call the noise texture; call a TexCoord node and multiply it by a vector (named Noise A) to control the tiling count. Use a Panner node, linking a positive vector to its Speed input, to make the texture move upward at a controlled speed. Duplicate these steps, give the copy its own tiling vector (named Noise B), set the copy's speed to a smaller value connected to Speed, extract the R (red) and G (green) channels from the two noise samples respectively, and add them. The result is a more variable upward disturbance.
(2) Use the B (blue) channel of the noise map as a mask to remove the unwanted texture around the edges, and connect the flame texture.
(3) Simulate the effect of gusting wind.
A Sine function (whose value oscillates between -1 and 1 over time) can be called, feeding it a Time node multiplied by a vector that controls the time speed; the mask (G) is then used to pick out a wave moving from bottom to top. The period of the sine function is set to 1.5 and the result multiplied by 2, so the band moving from bottom to top becomes wider and the wind-driven flame fluctuation looks more natural. A Clamp node constrains the range of the simulated wind, with a minimum input of 0.5 and a maximum of 2 (these values can be changed as needed).
(4) Subtract the disturbance steps to obtain a bottom-to-top fluctuation.
A Subtract node replaces the Multiply (0.1) node of the previous step, giving the base image a disturbance that sweeps from bottom to top, as shown in fig. 8.
3. Repair the disturbance at the bottom of the flame.
Since the model's UV runs from bottom to top, use the mask (G) again with a OneMinus (1-x) inversion node, adjusted so the lower part is black (equivalent to masking the flame bottom), then apply a Power node to strengthen the black region. Multiply this back into the disturbance range from the process above as a mask, removing the disturbance at the flame bottom, as shown in fig. 9.
4. Control the flame color.
Multiply the red (R) and green (G) channels of the flame texture by color vectors, exposing the vectors as controllable parameters; the red channel is the mask of the inner flame and the green channel is the mask of the outer flame. Add the two with an Add node and output to the emissive color node; make the display semi-transparent and feed the blue channel of the flame texture into the opacity node.
5. Build a follow-camera function.
(1) First obtain the camera position: call the Camera Position node and mask its X and Y components to get the camera's horizontal position. Then obtain the object's position with the Object Position node, and subtract the camera position from the object position to obtain the direction vector of the object toward the camera.
(2) Only the direction between the object and the camera is needed, not the distance, so connect a Normalize node: the direction is preserved while the length always becomes 1.
Split the result into X (R) and Y (G) components and compute the camera's rotation arc with an Arctangent2 node.
(3) Call a RotateAboutAxis node. Its rotation-angle input only accepts values from 0 to 1, where 0-1 corresponds to 0-360 degrees, so the radians obtained from the Arctangent2 node must be converted to a 0-1 value: divide by 2π (2 × 3.14159...) to obtain a rotation value of -0.25 to 0.25, then add 0.25 to obtain a correct rotation value of 0 to 0.5.
The Position input then receives the absolute world position, and NormalizedRotationAxis receives the object position as the camera's rotation axis; the PivotPoint input always takes the object's coordinates as the rotation center.
With this, when the camera rotates, the object obtains a rotation value that follows it; the function can be named MF_BillBoard. Connecting MF_BillBoard to the material's World Position Offset input makes the flame rotate to follow the camera.
6. Add firelight jitter.
(1) Create a material and set its domain to Light Function. Two sine functions drive the simulation, with fluctuation periods set to 3 and 4 respectively. For one sine, convert negative values to positive with an Abs node, invert with OneMinus (1-x), and multiply, simulating an erratic, now-bright-now-dim flicker rhythm; do the same for the other sine, inverting with 1-x and then multiplying. Then add the two results. Because flame flicker should never make the light go completely dark, the range is constrained and the overall brightness is raised by 0.5.
(2) Create a point light (corresponding to the "point light source model") and assign the light-function material (named m_lightfunction) to it. The final combined effect is shown in fig. 10.
This method replaces the tedious production and hand-off of a sequence sheet with a single channel map, enabling fast and efficient art replacement and low-cost variety; it simulates the erratic artistic flicker of a flame, is packaged together with the light source, and is convenient to use.
For the above method embodiment, referring to fig. 11, a model rendering apparatus includes:
a model acquisition module 1102, configured to acquire a preset target model and flame effect parameters; the target model comprises a patch model and a point light source model in a specified positional relationship; the flame effect parameters comprise a flame texture map and a noise ripple map corresponding to the patch model, and a brightness change parameter corresponding to the point light source model; the flame texture map indicates the basic shape of the flame, and the brightness change parameter indicates how the brightness of the point light source model changes over time;
a shape parameter determination module 1104, configured to determine a flame shape parameter based on the flame texture map and the noise ripple map; the flame shape parameter indicates how the flame shape changes over time;
the model rendering module 1106 is configured to render the target model based on the brightness variation parameter and the flame shape parameter, so as to obtain the target model with the flame special effect.
With the model rendering apparatus described above, a preset target model and flame effect parameters are acquired; the target model comprises a patch model and a point light source model in a specified positional relationship; the flame effect parameters comprise a flame texture map and a noise ripple map corresponding to the patch model, and a brightness change parameter corresponding to the point light source model; the flame texture map indicates the basic shape of the flame, and the brightness change parameter indicates how the brightness of the point light source model changes over time. A flame shape parameter, indicating how the flame shape changes over time, is determined based on the flame texture map and the noise ripple map, and the target model is rendered based on the brightness change parameter and the flame shape parameter to obtain a target model with a flame special effect. In this manner, the target model is rendered with a flame shape parameter and a brightness change parameter that vary over time, so that the rendered target model displays a more realistic dancing, flickering flame and the flame effect is generated more efficiently.
The shape parameter determination module is further configured to: determine a noise texture parameter based on the noise ripple map, the noise texture parameter indicating how the offset values of the corresponding texture coordinates change over time; and determine the flame shape parameter based on the noise texture parameter and the flame texture map.
The shape parameter determination module is further configured to: determine a first noise texture based on the noise ripple map; tile the first noise texture along the direction corresponding to a preset first vector to obtain a tiled first noise texture; establish, taking the length of a preset second vector as the change speed and the texture direction corresponding to the patch model as the change direction, the relation describing how the texture coordinates of the first noise texture change over time; and determine the noise texture parameter based on the first noise texture constrained to the basic flame shape and the corresponding texture coordinate changes.
The noise ripple map comprises a first channel image and a second channel image whose noise scales differ. The shape parameter determination module is further configured to: superpose the noise corresponding to the first channel image and the second channel image, and determine the superposed noise as the first noise texture.
The noise ripple map comprises a third channel image, which is a first mask image corresponding to the basic shape of the flame. The shape parameter determination module is further configured to: process the first noise texture based on the first mask image to obtain the noise texture parameter.
The apparatus comprises: a periodic function establishing module, configured to establish a periodic function corresponding to a preset offset parameter; and a sampling module, configured to sample the noise texture parameter based on the periodic function, in the same direction as the texture coordinates of the patch model, to obtain the sampled noise texture parameter.
The sampling module is further configured to: establish a mask for the part of the image corresponding to the bottom of the basic flame shape in the texture map storing the noise texture parameter; and sample, based on the periodic function and in the same direction as the texture coordinates of the patch model, the noise texture parameters other than those corresponding to that part of the image.
The apparatus further comprises: a color texture parameter determination module, configured to determine flame color texture parameters based on the flame texture map and preset flame color parameters.
The flame texture map comprises a fourth channel image and a fifth channel image; the fourth channel image comprises a second mask image corresponding to the shape of the outer flame, and the fifth channel image comprises a third mask image corresponding to the shape of the inner flame. The flame color parameters include an outer flame color parameter and an inner flame color parameter. The color texture parameter determination module is further configured to: determine the inner flame color texture parameter based on the second mask image and the inner flame color parameter; and determine the outer flame color texture parameter based on the third mask image, the outer flame color parameter, and a preset transparency parameter.
The apparatus further comprises a brightness change parameter determination module, configured to: generate a target periodic function based on a preset first trigonometric function and a preset second trigonometric function whose periods differ, the function values of the target periodic function all being positive; adjust the function parameters of the target periodic function based on a preset brightness range to obtain an adjusted target periodic function whose value range matches the brightness range; and determine the brightness change parameter based on the adjusted target periodic function.
The apparatus further comprises: an orientation adjustment module, configured to adjust the orientation of the target model in the target virtual scene based on preset pose parameters of a virtual camera, so that the target model faces the virtual camera; the virtual camera is used to photograph the target virtual scene.
The pose parameters comprise a position parameter. The orientation adjustment module is further configured to: acquire a first position parameter of the virtual camera in the target virtual scene and a second position parameter of the patch model in the target virtual scene; determine a direction vector from the patch model toward the virtual camera based on the first and second position parameters; determine a target rotation parameter of the patch model based on the direction vector and the patch model's current orientation; and adjust the orientation of the patch model in the target virtual scene based on the target rotation parameter.
The apparatus further comprises: an orientation update module, configured to, if the pose parameters of the virtual camera change, update the orientation of the target model in the target virtual scene based on the changed position parameters, so that the reoriented target model faces the virtual camera.
The present embodiment also provides an electronic device including a processor and a memory, the memory storing machine-executable instructions executable by the processor; the processor executes the machine-executable instructions to implement the model rendering method above, for example:
acquiring a preset target model and flame effect parameters, the target model comprising a patch model and a point light source model in a specified positional relationship, the flame effect parameters comprising a flame texture map and a noise ripple map corresponding to the patch model and a brightness change parameter corresponding to the point light source model, the flame texture map indicating the basic shape of the flame, and the brightness change parameter indicating how the brightness of the point light source model changes over time; determining a flame shape parameter based on the flame texture map and the noise ripple map, the flame shape parameter indicating how the flame shape changes over time; and rendering the target model based on the brightness change parameter and the flame shape parameter to obtain a target model with a flame special effect.
In this manner, the target model is rendered with a flame shape parameter and a brightness change parameter that vary over time, so that the rendered target model displays a more realistic dancing, flickering flame and the flame effect is generated more efficiently.
Optionally, the step of determining the flame shape parameter based on the flame texture map and the noise ripple map includes: determining a noise texture parameter based on the noise ripple map, the noise texture parameter indicating how the offset values of the corresponding texture coordinates change over time; and determining the flame shape parameter based on the noise texture parameter and the flame texture map.
Optionally, the step of determining the noise texture parameter based on the noise ripple map includes: determining a first noise texture based on the noise ripple map; tiling the first noise texture along the direction corresponding to a preset first vector to obtain a tiled first noise texture; establishing, taking the length of a preset second vector as the change speed and the texture direction corresponding to the patch model as the change direction, the relation describing how the texture coordinates of the first noise texture change over time; and determining the noise texture parameter based on the first noise texture constrained to the basic flame shape and the corresponding texture coordinate changes.
Optionally, the noise ripple map includes a first channel image and a second channel image whose noise scales differ. The step of determining the first noise texture based on the noise ripple map includes: superposing the noise corresponding to the first channel image and the second channel image, and determining the superposed noise as the first noise texture.
Optionally, the noise ripple map includes a third channel image, which is a first mask image corresponding to the basic shape of the flame. Determining the noise texture parameter based on the first noise texture of the basic flame shape and the corresponding texture coordinate changes includes: processing the first noise texture based on the first mask image to obtain the noise texture parameter.
Optionally, after determining the noise texture parameter, the method further includes: establishing a periodic function corresponding to a preset offset parameter; and sampling the noise texture parameter based on the periodic function, in the same direction as the texture coordinates of the patch model, to obtain the sampled noise texture parameter.
Optionally, the step of sampling the noise texture parameter based on the periodic function in the same direction as the texture coordinates of the patch model includes: establishing a mask for the part of the image corresponding to the bottom of the basic flame shape in the texture map storing the noise texture parameter; and sampling, based on the periodic function and in the same direction as the texture coordinates of the patch model, the noise texture parameters other than those corresponding to that part of the image.
Optionally, the method further comprises: determining flame color texture parameters based on the flame texture map and preset flame color parameters.
Optionally, the flame texture map includes a fourth channel image and a fifth channel image; the fourth channel image comprises a second mask image corresponding to the shape of the outer flame, and the fifth channel image comprises a third mask image corresponding to the shape of the inner flame. The flame color parameters include an outer flame color parameter and an inner flame color parameter. Determining the flame color texture parameters based on the flame texture map and the preset flame color parameters includes: determining the inner flame color texture parameter based on the second mask image and the inner flame color parameter; and determining the outer flame color texture parameter based on the third mask image, the outer flame color parameter, and a preset transparency parameter.
Optionally, the brightness change parameter is determined by: generating a target periodic function based on a preset first trigonometric function and a preset second trigonometric function, where the first trigonometric function and the second trigonometric function have different periods and the function values of the target periodic function are all positive; adjusting function parameters of the target periodic function based on a preset brightness range to obtain an adjusted target periodic function, where the value range of the adjusted target periodic function matches the brightness range; and determining the brightness change parameter based on the adjusted target periodic function.
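A minimal sketch of such a target periodic function follows, assuming two sine terms with example periods and an example brightness range; the patent only requires differing periods, positive values, and a range matched to the brightness range:

```python
import math

def brightness(t, lo=0.6, hi=1.0, p1=0.7, p2=1.9):
    """Flicker brightness built from two sine waves of different periods.

    lo, hi : assumed preset brightness range
    p1, p2 : assumed differing periods of the two trigonometric functions
    """
    # The sum of two sines lies in [-2, 2]; shifting and scaling keeps the
    # function positive and remaps it onto the preset brightness range.
    raw = math.sin(2 * math.pi * t / p1) + math.sin(2 * math.pi * t / p2)
    normalized = (raw + 2.0) / 4.0          # now in [0, 1]
    return lo + (hi - lo) * normalized      # matched to the brightness range

for t in (0.0, 0.25, 0.5, 0.75):
    print(f"t={t:.2f}s -> brightness {brightness(t):.3f}")
```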
Optionally, the method further comprises: based on preset pose parameters of the virtual camera, adjusting the orientation of the target model in the target virtual scene so as to enable the target model to face the virtual camera; the virtual camera is used for shooting a target virtual scene.
Optionally, the pose parameters include position parameters; adjusting the orientation of the target model in the target virtual scene based on the preset pose parameters of the virtual camera includes: acquiring a first position parameter of the virtual camera in the target virtual scene and a second position parameter of the patch model in the target virtual scene; determining a direction vector of the patch model towards the virtual camera based on the first position parameter and the second position parameter; determining a target rotation parameter of the patch model based on the direction vector and the current orientation of the patch model; and adjusting the orientation of the patch model in the target virtual scene based on the target rotation parameter.
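As a simplified illustration, the following sketch computes a target rotation restricted to yaw; restricting the correction to a single axis is an assumption made to keep the example short, and a production billboard would typically build a full look-at basis instead:

```python
import math
import numpy as np

def billboard_yaw(patch_pos, camera_pos, current_yaw):
    """Yaw rotation that turns the patch model to face the virtual camera.

    patch_pos, camera_pos : 3D positions in the target virtual scene
    current_yaw           : current orientation of the patch model (radians)
    """
    to_camera = camera_pos - patch_pos           # direction vector
    target_yaw = math.atan2(to_camera[0], to_camera[2])
    return target_yaw - current_yaw              # target rotation parameter

patch = np.array([0.0, 0.0, 0.0])
camera = np.array([3.0, 1.0, 4.0])
print(f"rotate patch by {math.degrees(billboard_yaw(patch, camera, 0.0)):.1f} deg")
```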
Optionally, the method further comprises: and if the pose parameters of the virtual camera are changed, updating the orientation of the target model in the target virtual scene based on the changed position parameters, so that the updated target model is oriented to the virtual camera.
Referring to fig. 12, the electronic device includes a processor 100 and a memory 101, the memory 101 storing machine executable instructions that can be executed by the processor 100, the processor 100 executing the machine executable instructions to implement the model rendering method described above.
Further, the electronic device shown in fig. 12 further includes a bus 102 and a communication interface 103, and the processor 100, the communication interface 103, and the memory 101 are connected through the bus 102.
The memory 101 may include a high-speed random access memory (RAM), and may further include a non-volatile memory, such as at least one magnetic disk memory. The communication connection between the system network element and at least one other network element is implemented via at least one communication interface 103 (which may be wired or wireless), and may use the internet, a wide area network, a local area network, a metropolitan area network, etc. The bus 102 may be an ISA bus, a PCI bus, an EISA bus, or the like, and buses may be classified into address buses, data buses, control buses, etc. For ease of illustration, only one bidirectional arrow is shown in FIG. 12, but this does not mean that there is only one bus or one type of bus.
The processor 100 may be an integrated circuit chip with signal processing capability. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 100 or by instructions in the form of software. The processor 100 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), etc.; it may also be a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The methods, steps, and logic blocks disclosed in the embodiments of the present disclosure may be implemented or performed by such a processor. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of a method disclosed in connection with the embodiments of the present disclosure may be embodied directly as being performed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software modules may be located in a storage medium well known in the art, such as a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or a register. The storage medium is located in the memory 101, and the processor 100 reads the information in the memory 101 and, in combination with its hardware, performs the steps of the method of the previous embodiment.
The present embodiment also provides a machine-readable storage medium storing machine-executable instructions that, when invoked and executed by a processor, cause the processor to implement the model rendering method described above.
The computer program product of the model rendering method, apparatus, and electronic device provided in the embodiments of the present disclosure includes a computer-readable storage medium storing program code, where the instructions included in the program code may be used to perform the method described in the foregoing method embodiments, for example:
acquiring a preset target model and flame effect parameters, the target model including a patch model and a point light source model having a specified positional relation, and the flame effect parameters including a flame texture map and a noise ripple map corresponding to the patch model and a brightness change parameter corresponding to the point light source model, where the flame texture map is used to indicate the basic shape of the flame and the brightness change parameter is used to indicate the change of the brightness of the point light source model with time; determining a flame shape parameter based on the flame texture map and the noise ripple map, the flame shape parameter being used to indicate the change of the flame shape with time; and rendering the target model based on the brightness change parameter and the flame shape parameter to obtain the target model with the flame special effect.
In this manner, the target model is rendered through the flame shape parameter and the brightness change parameter that change with time, so that the rendered target model displays a more realistic jumping and flickering flame effect, and the generation efficiency of the flame effect is improved.
Optionally, the target model includes a patch model; adjusting the orientation of the target model in the target virtual scene based on the preset pose parameters of the virtual camera includes: acquiring the pose parameters of the virtual camera in the target virtual scene and a first position parameter of the patch model in the target virtual scene, the pose parameters including a second position parameter and an attitude parameter; determining a direction vector of the patch model towards the virtual camera based on the first position parameter and the second position parameter; determining a target rotation parameter of the patch model based on the attitude parameter and the direction vector; and adjusting the orientation of the patch model in the target virtual scene based on the target rotation parameter.
Optionally, the method further comprises: and if the pose parameters of the virtual camera are changed, updating the orientation of the target model in the target virtual scene based on the changed position parameters, so that the updated target model is oriented to the virtual camera.
Optionally, the target model includes a patch model and a point light source model; the point light source model is arranged at the bottom of the patch model.
Optionally, the target model includes a patch model and a point light source model; the flame effect parameters include texture rendering parameters corresponding to the patch model and a brightness change parameter corresponding to the point light source model; the brightness change parameter is used to indicate the change of the brightness of the point light source model with time; rendering the adjusted target model based on the flame effect parameters includes: rendering the patch model based on the texture rendering parameters; and rendering the point light source model based on the brightness change parameter.
Optionally, the texture rendering parameters are determined by: acquiring a preset flame texture map and a noise ripple map, the flame texture map being used to indicate the flame shape; determining a flame color texture parameter based on the flame texture map and a preset flame color parameter; and determining the texture rendering parameters based on the flame color texture parameter and the noise ripple map.
Optionally, the step of determining the texture rendering parameter based on the flame color texture parameter and the noise ripple map includes: determining noise texture parameters based on the noise ripple map and preset noise movement parameters; the noise texture parameter is used for indicating flame flickering state; based on the noise texture parameter and the flame color texture parameter, texture rendering parameters are generated.
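One way such a combination could work is sketched below, assuming the noise texture parameter perturbs the V coordinate of the flame color lookup; the distortion form and strength are assumptions, since the patent leaves the combination unspecified:

```python
import numpy as np

def texture_rendering_params(flame_rgba, noise, strength=0.08):
    """Generate texture rendering parameters by letting the noise texture
    parameter perturb the flame color texture lookup.

    flame_rgba : (H, W, 4) flame color texture parameters
    noise      : (H, W) noise texture parameter in [0, 1]
    strength   : assumed distortion strength; larger values flicker more
    """
    h, w = noise.shape
    v, u = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # Shift each texel's V lookup by the centered noise value, wrapping at
    # the texture edge, so the sampled flame outline wavers as the noise
    # itself scrolls over time.
    v_shifted = (v + np.rint((noise - 0.5) * strength * h).astype(int)) % h
    return flame_rgba[v_shifted, u]

rng = np.random.default_rng(1)
out = texture_rendering_params(rng.random((64, 64, 4)), rng.random((64, 64)))
print(out.shape)  # (64, 64, 4)
```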
Optionally, the flame texture map includes a first channel image and a second channel image; the first channel image comprises a first mask image corresponding to the shape of the flame outer flame; the second channel image comprises a second mask image corresponding to the shape of the flame inner flame; the flame color parameters include an outer flame color parameter and an inner flame color parameter; based on the flame texture map and a preset flame color parameter, determining the flame color texture parameter comprises the following steps: determining an inner flame color texture parameter based on the first mask image and the inner flame color parameter; and determining the outer flame color texture parameter based on the second mask image, the outer flame color parameter and a preset transparency parameter.
Optionally, the noise ripple map includes a third channel image and a fourth channel image; the third channel image is used to carry the first noise texture; the fourth channel image includes a third mask image corresponding to the flame shape; determining the noise texture parameter based on the noise ripple map and a preset noise movement parameter includes: determining an initial noise texture parameter based on the first noise texture and the noise movement parameter; and processing the initial noise texture parameter based on the third mask image to obtain the noise texture parameter, where the noise texture parameter corresponds to the flame texture map.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system and apparatus may refer to corresponding procedures in the foregoing method embodiments, which are not described herein again.
In addition, in the description of the embodiments of the present disclosure, unless explicitly stated and limited otherwise, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the terms in this disclosure will be understood by those skilled in the art in the specific case.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the present disclosure, in essence, or the part contributing to the prior art, or a part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present disclosure. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
In the description of the present disclosure, it should be noted that the directions or positional relationships indicated by the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are based on the directions or positional relationships shown in the drawings, are merely for convenience of describing the present disclosure and simplifying the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present disclosure. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the foregoing embodiments are merely specific implementations of the present disclosure and are not intended to limit its protection scope. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that anyone familiar with the art may, within the technical scope disclosed herein, still modify or readily conceive of changes to the technical solutions described in the foregoing embodiments, or make equivalent substitutions for some of the technical features thereof; such modifications, changes, or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present disclosure, and shall all be covered by the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.
Claims (16)
1. A model rendering method, comprising:
acquiring a preset target model and flame effect parameters; the target model comprises a patch model and a point light source model having a specified positional relation; the flame effect parameters comprise a flame texture map and a noise ripple map corresponding to the patch model, and a brightness change parameter corresponding to the point light source model; the flame texture map is used for indicating the basic shape of the flame; the brightness change parameter is used for indicating the change of the brightness of the point light source model with time;
determining a flame shape parameter based on the flame texture map and the noise ripple map; the flame shape parameter is used for indicating the change of the flame shape with time;
and rendering the target model based on the brightness change parameter and the flame shape parameter to obtain the target model with the flame special effect.
2. The method of claim 1, wherein determining flame shape parameters based on the flame texture map and the noise ripple map comprises:
determining noise texture parameters based on the noise ripple map; the noise texture parameter is used for indicating the change condition of the offset value of the corresponding texture coordinate along with time;
determining the flame shape parameter based on the noise texture parameter and the flame texture map.
3. The method of claim 2, wherein determining noise texture parameters based on the noise ripple map comprises:
determining a first noise texture based on the noise ripple map;
tiling the first noise texture in the direction corresponding to a preset first vector to obtain a tiled first noise texture;
taking the length of a preset second vector as the change speed and the texture direction corresponding to the patch model as the change direction, establishing the change relation of the texture coordinates corresponding to the first noise texture with time;
and determining the noise texture parameter based on the flame basic shape and the first noise texture with the corresponding texture coordinate change.
4. The method of claim 3, wherein the noise ripple map comprises a first channel image and a second channel image; the sizes of the noise waves corresponding to the first channel image and the second channel image are different;
the step of determining the first noise texture based on the noise ripple map comprises:
superposing the noise textures corresponding to the first channel image and the second channel image, and determining the superposed result as the first noise texture.
5. A method according to claim 3, wherein the noise ripple map comprises a third channel image; the third channel image is a first mask image corresponding to the basic shape of the flame;
determining the noise texture parameter based on the flame basic shape and the first noise texture with the corresponding texture coordinate change comprises:
processing the first noise texture based on the first mask image to obtain the noise texture parameter.
6. The method of claim 3, wherein after determining the noise texture parameter based on the flame basic shape and the first noise texture with the corresponding texture coordinate change, the method further comprises:
establishing a periodic function corresponding to a preset offset parameter;
and performing, based on the periodic function, sampling processing on the noise texture parameter in the same direction as the texture coordinates of the patch model, to obtain the sampled noise texture parameter.
7. The method according to claim 6, wherein the step of performing sampling processing on the noise texture parameter in the same direction as the texture coordinates of the patch model based on the periodic function comprises:
establishing a mask for the partial image corresponding to the bottom of the flame basic shape in the texture map storing the noise texture parameter;
and performing, based on the periodic function, the sampling processing in the same direction as the texture coordinates of the patch model on the noise texture parameters other than those corresponding to the partial image.
8. The method according to claim 1, wherein the method further comprises:
and determining flame color texture parameters based on the flame texture map and preset flame color parameters.
9. The method of claim 8, wherein the flame texture map comprises a fourth channel image and a fifth channel image; the fourth channel image comprises a second mask image corresponding to the shape of the flame outer flame; the fifth channel image comprises a third mask image corresponding to the shape of the inner flame of the flame; the flame color parameters include an outer flame color parameter and an inner flame color parameter;
based on the flame texture map and a preset flame color parameter, determining the flame color texture parameter comprises the following steps:
determining an inner flame color texture parameter based on the second mask image and the inner flame color parameter;
and determining an outer flame color texture parameter based on the third mask image, the outer flame color parameter, and a preset transparency parameter.
10. The method of claim 1, wherein the brightness variation parameter is determined by:
generating a target periodic function based on a preset first trigonometric function and a preset second trigonometric function; the function periods of the first trigonometric function and the second trigonometric function are different; the function values of the target periodic function are positive numbers;
based on a preset brightness range, adjusting function parameters of the target periodic function to obtain an adjusted target periodic function; the value range of the function value of the adjusted target periodic function is matched with the brightness range;
and determining the brightness change parameter based on the adjusted target periodic function.
11. The method according to claim 1, wherein the method further comprises:
based on preset pose parameters of a virtual camera, adjusting the orientation of the target model in a target virtual scene so as to enable the target model to be oriented to the virtual camera; the virtual camera is used for shooting the target virtual scene.
12. The method of claim 11, wherein the pose parameters include position parameters;
based on preset pose parameters of the virtual camera, adjusting the orientation of the target model in the target virtual scene, wherein the method comprises the following steps:
acquiring a first position parameter of the virtual camera in the target virtual scene and a second position parameter of the patch model in the target virtual scene;
determining a direction vector of the patch model towards the virtual camera based on the first position parameter and the second position parameter;
determining a target rotation parameter of the patch model based on the direction vector and a current orientation of the patch model;
and adjusting the orientation of the patch model in the target virtual scene based on the target rotation parameter.
13. The method of claim 11, wherein the method further comprises:
and if the pose parameters of the virtual camera are changed, updating the orientation of the target model in the target virtual scene based on the changed position parameters, so that the target model with updated orientation is oriented to the virtual camera.
14. A model rendering apparatus, characterized by comprising:
the model acquisition module is used for acquiring a preset target model and flame effect parameters; the target model comprises a patch model and a point light source model having a specified positional relation; the flame effect parameters comprise a flame texture map and a noise ripple map corresponding to the patch model, and a brightness change parameter corresponding to the point light source model; the flame texture map is used for indicating the basic shape of the flame; the brightness change parameter is used for indicating the change of the brightness of the point light source model with time;
the shape parameter determining module is used for determining a flame shape parameter based on the flame texture map and the noise ripple map; the flame shape parameter is used for indicating the change of the flame shape with time;
and the model rendering module is used for rendering the target model based on the brightness change parameter and the flame shape parameter to obtain the target model with the flame special effect.
15. An electronic device comprising a processor and a memory, the memory storing machine executable instructions executable by the processor, the processor executing the machine executable instructions to implement the model rendering method of any one of claims 1-13.
16. A machine-readable storage medium storing machine-executable instructions which, when invoked and executed by a processor, cause the processor to implement the model rendering method of any one of claims 1-13.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202311448941.7A | 2023-10-31 | 2023-10-31 | Model rendering method and device and electronic equipment
Publications (1)

Publication Number | Publication Date
---|---
CN117671097A (en) | 2024-03-08
Family

ID=90077904
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111968216B (en) | Volume cloud shadow rendering method and device, electronic equipment and storage medium | |
US7583264B2 (en) | Apparatus and program for image generation | |
CA2282637C (en) | Method for rendering shadows on a graphical display | |
CN111508052A (en) | Rendering method and device of three-dimensional grid body | |
AU2019226134B2 (en) | Environment map hole-filling | |
Eck | Introduction to Computer Graphics | |
US11276150B2 (en) | Environment map generation and hole filling | |
CN111476877A (en) | Shadow rendering method and device, electronic equipment and storage medium | |
Ganovelli et al. | Introduction to computer graphics: A practical learning approach | |
US7064753B2 (en) | Image generating method, storage medium, image generating apparatus, data signal and program | |
CN104899913B (en) | A kind of fluid special effect making method true to nature under virtual stage environment | |
CN115526976A (en) | Virtual scene rendering method and device, storage medium and electronic equipment | |
CN118397160A (en) | Autonomous three-dimensional rendering engine for reverse site building system of oil field site | |
Toisoul et al. | Accessible GLSL Shader Programming. | |
CN117671097A (en) | Model rendering method and device and electronic equipment | |
CN114241098A (en) | Cartoon dynamic effect manufacturing method and device | |
CN114392551A (en) | Display control method and device of virtual object and electronic equipment | |
CN112465941B (en) | Volume cloud processing method and device, electronic equipment and storage medium | |
CN114937103A (en) | Model rendering method and device for dynamic effect, electronic equipment and storage medium | |
CN114266855A (en) | Light effect simulation method and device of dot matrix screen and electronic equipment | |
CN114627214A (en) | Vertex animation processing method and device and electronic equipment | |
Yang et al. | Visual effects in computer games | |
CN116958332B (en) | Method and system for mapping 3D model in real time of paper drawing based on image recognition | |
JP4436101B2 (en) | robot | |
CN116091678A (en) | Smoke effect rendering method and device and electronic equipment |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |