CN114022605A - Map processing method and device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN114022605A
CN114022605A
Authority
CN
China
Prior art keywords
fog effect
map
effect
fog
processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111290319.9A
Other languages
Chinese (zh)
Inventor
李治民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202111290319.9A priority Critical patent/CN114022605A/en
Publication of CN114022605A publication Critical patent/CN114022605A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/04: Texture mapping
    • G06T 15/50: Lighting effects
    • G06T 15/60: Shadow generation

Abstract

The disclosure belongs to the technical field of computer graphics, and relates to a map processing method and apparatus, a storage medium, and an electronic device. The method comprises the following steps: acquiring a first map and a fog effect material, and mixing the first map and the fog effect material to generate a flocculent fog effect; performing texture movement processing on the flocculent fog effect to obtain a fluttering fog effect, and mixing the fluttering fog effect and the first map to obtain a wiredrawing state fog effect; and performing illumination offset processing on the wiredrawing state fog effect to obtain a scattering fog effect, and performing light and shade level adjustment on the scattering fog effect to obtain a target fog effect. The method makes full use of the characteristics of the dot matrix, so that the texture of the fog effect has more volume and more layering, bringing a more realistic fog effect at a lower performance cost. It also adds parameter adjustment in several new directions, giving the fog effect more expressive possibilities, enriching its application scenarios, and making the method suitable for more game art style projects.

Description

Map processing method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of computer graphics technologies, and in particular, to a map processing method, a map processing apparatus, a computer-readable storage medium, and an electronic device.
Background
When making a fog effect scene such as a virtual desert, a very large and realistic special effect such as a sandstorm usually needs to be produced. Analysis of the demands in game scenes shows that the sandstorm needs to receive illumination, be dynamic, and have a strong sense of volume. However, current games on the market lack an effect scheme of the required quality, and most achieve the effect with special-effect-emitted maps in order to save performance.
Simulating the sandstorm effect with disturbed, moving transparent special-effect maps emitted by a particle special effect system is cheap, but it cannot meet the requirements of projects with delicate art expression. Specifically, because the material algorithm has no relationship with the light source information, the rim light produced when light strikes the dust body cannot be seen. Secondly, because the dynamic effect of wind-blown sand must be considered, most of the emitted maps rotate irregularly and randomly, so the sense of volume is lost or cannot be expressed accurately. More importantly, because a map is used to simulate the sandstorm, there is no scene inside the sandstorm, its expression is not delicate, it cannot blend into the terrain, and only a 'false' shadow can be made.
In view of the above, there is a need in the art to develop a new method and apparatus for processing a map.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to a map processing method, a map processing apparatus, a computer-readable storage medium, and an electronic device, so as to overcome, at least to some extent, the technical problem of poor fog effect caused by the limitations of the related art.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of embodiments of the present invention, there is provided a map processing method, including:
acquiring a first map and a fog effect material, and mixing the first map and the fog effect material to generate a flocculent fog effect;
performing texture movement processing on the flocculent fog effect to obtain a fluttering fog effect, and mixing the fluttering fog effect and the first map to obtain a wiredrawing state fog effect;
and performing illumination offset processing on the wiredrawing state fog effect to obtain a scattering fog effect, and performing light and shade level adjustment on the scattering fog effect to obtain a target fog effect.
In an exemplary embodiment of the invention, the obtaining the first map and the fog effect material includes:
obtaining a single patch model, and obtaining a material map of the single patch model;
and synthesizing the material map to obtain a first map, and obtaining the fog effect material.
In an exemplary embodiment of the present invention, the mixing the first map and the fog effect material to generate a flocculent fog effect includes:
mixing the first map and the fog effect material to obtain an original fog effect, and obtaining a second map;
superposing the second map and the original fog effect to obtain a block fog effect, and acquiring a third map;
and superposing the third map and the block fog effect to obtain a flocculent fog effect.
In an exemplary embodiment of the invention, the method further comprises:
and mixing the fluttering fog effect and the second map to obtain the wiredrawing state fog effect.
In an exemplary embodiment of the present invention, the performing light and shade level adjustment on the scattering fog effect to obtain a target fog effect includes:
mixing the scattering fog effect and the second map to obtain a mixed fog effect, and obtaining a self-luminous map;
and performing light and shade level adjustment on the mixed fog effect and the self-luminous map to obtain a target fog effect.
In an exemplary embodiment of the invention, the method further comprises:
and mixing the fluttering fog effect and the third map to obtain the wiredrawing state fog effect.
In an exemplary embodiment of the present invention, the performing light and shade level adjustment on the scattering fog effect to obtain a target fog effect includes:
mixing the scattering fog effect and the third map to obtain a mixed fog effect, and obtaining a self-luminous map;
and performing light and shade level adjustment on the mixed fog effect and the self-luminous map to obtain a target fog effect.
In an exemplary embodiment of the present invention, the performing illumination offset processing on the wiredrawing state fog effect to obtain the scattering fog effect includes:
determining an illumination direction corresponding to the wiredrawing state fog effect, and copying the wiredrawing state fog effect to obtain a backup fog effect;
and performing offset processing on the backup fog effect according to the illumination direction to obtain an offset fog effect, and performing brightness adjustment processing on the offset fog effect to obtain a brightness fog effect, so as to determine the wiredrawing state fog effect and the brightness fog effect as the scattering fog effect.
In an exemplary embodiment of the present invention, the performing offset processing on the backup fog effect according to the illumination direction to obtain an offset fog effect, and performing brightness adjustment processing on the offset fog effect to obtain a brightness fog effect includes:
performing offset processing on the backup fog effect according to the illumination direction to obtain an offset fog effect, and performing brightness increase processing on the offset fog effect to obtain a brightness fog effect; or
performing offset processing on the backup fog effect according to the direction opposite to the illumination direction to obtain an offset fog effect, and performing brightness decrease processing on the offset fog effect to obtain a brightness fog effect.
In an exemplary embodiment of the present invention, the performing light and shade level adjustment on the scattering fog effect to obtain a target fog effect includes:
mixing the scattering fog effect and the first map to obtain a mixed fog effect, and obtaining a self-luminous map;
and performing light and shade level adjustment on the mixed fog effect and the self-luminous map to obtain a target fog effect.
According to a second aspect of the embodiments of the present invention, there is provided a map processing apparatus including:
the mixing processing module is configured to obtain a first map and a fog effect material, and mix the first map and the fog effect material to generate a flocculent fog effect;
the movement processing module is configured to perform texture movement processing on the flocculent fog effect to obtain a fluttering fog effect, and mix the fluttering fog effect and the first map to obtain a wiredrawing state fog effect;
and the fog effect adjusting module is configured to perform illumination offset processing on the wiredrawing state fog effect to obtain a scattering fog effect, and perform light and shade level adjustment on the scattering fog effect to obtain a target fog effect.
According to a third aspect of embodiments of the present invention, there is provided an electronic apparatus including: a processor and a memory; wherein the memory has stored thereon computer readable instructions, which when executed by the processor, implement the map processing method in any of the above exemplary embodiments.
According to a fourth aspect of embodiments of the present invention, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a map processing method in any of the above-described exemplary embodiments.
As can be seen from the foregoing technical solutions, the map processing method, the map processing apparatus, the computer-readable storage medium, and the electronic device in the exemplary embodiments of the present disclosure have at least the following advantages and positive effects:
In the method and apparatus provided by the exemplary embodiments of the present disclosure, the mixing processing of the first map and the fog effect material and the subsequent mixing processing make full use of the characteristics of the dot matrix, so that the texture of the fog effect has more volume and layering, bringing a more realistic fog effect at a lower performance cost. Furthermore, parameter adjustment in several new directions is added in the texture movement processing, illumination offset processing, and light and shade level adjustment, giving the fog effect more expressive possibilities, enriching its application scenarios, and making the method suitable for more game art style projects.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 is a schematic diagram showing an effect of a particle effect system simulating a sandstorm effect in the related art;
FIG. 2 is a schematic diagram showing another effect of a particle special effect system simulating a sandstorm effect in the related art;
FIG. 3 schematically illustrates a flow chart of a map processing method in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates a flow chart of a method for obtaining a first map and a fog effect material in an exemplary embodiment of the disclosure;
FIG. 5 schematically illustrates a flow chart of a method of mixing processing in an exemplary embodiment of the disclosure;
fig. 6 schematically shows a flow chart of a method of illumination offset processing in an exemplary embodiment of the present disclosure;
fig. 7 schematically illustrates a flow chart of a method of brightness adjustment processing in an exemplary embodiment of the present disclosure;
FIG. 8 schematically illustrates a flow chart of a method of shading gradation adjustment in an exemplary embodiment of the present disclosure;
FIG. 9 schematically illustrates a flow chart of another method of shading adjustment in an exemplary embodiment of the present disclosure;
FIG. 10 is a schematic flow chart diagram illustrating a further method of shading adjustment in an exemplary embodiment of the present disclosure;
FIG. 11 is an interface diagram schematically illustrating a method for obtaining a first map in an application scenario according to an exemplary embodiment of the present disclosure;
fig. 12 is a schematic diagram illustrating an effect of an original fog effect generated in an application scenario in an exemplary embodiment of the present disclosure;
fig. 13 schematically illustrates an effect diagram of a block fog effect in an application scenario in an exemplary embodiment of the present disclosure;
fig. 14 schematically illustrates an effect diagram of a flocculent fog effect obtained by superimposing a third map in an application scenario in an exemplary embodiment of the present disclosure;
fig. 15 schematically illustrates an effect diagram of a wiredrawing state fog effect in an application scenario in an exemplary embodiment of the present disclosure;
FIG. 16 is a schematic diagram illustrating the effect of scattering fog effect in an application scenario in an exemplary embodiment of the disclosure;
FIG. 17 is a schematic diagram illustrating the effect of a target fog effect in an application scenario in an exemplary embodiment of the disclosure;
fig. 18 is a schematic diagram illustrating an effect of a sandstorm effect generated in an application scenario in an exemplary embodiment of the present disclosure;
FIG. 19 schematically illustrates a comparison between a sandstorm effect generated from a particle-special-effect-emitted map and a sandstorm effect generated by the method of the present disclosure in an exemplary embodiment of the present disclosure;
FIG. 20 is a schematic interface diagram illustrating an application scenario using the map processing method of the present disclosure in an exemplary embodiment of the present disclosure;
FIG. 21 is a schematic diagram illustrating an interface for adjusting parameters in an application scenario in an exemplary embodiment of the present disclosure;
fig. 22 is a schematic structural view schematically illustrating a map processing apparatus in an exemplary embodiment of the present disclosure;
FIG. 23 schematically illustrates an electronic device for implementing a map processing method in an exemplary embodiment of the present disclosure;
fig. 24 schematically illustrates a computer-readable storage medium for implementing a map processing method in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/parts/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. other than the listed elements/components/etc.; the terms "first" and "second", etc. are used merely as labels, and are not limiting on the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
When making a fog effect scene such as a virtual desert, a very large and realistic special effect such as a sandstorm usually needs to be produced. Analysis of the demands in game scenes shows that the sandstorm needs to receive illumination, be dynamic, and have a strong sense of volume. However, current games on the market lack an effect scheme of the required quality, and most achieve the effect with special-effect-emitted maps in order to save performance.
Simulating the sandstorm effect with disturbed, moving transparent special-effect maps emitted by a particle special effect system is cheap, but it cannot meet the requirements of projects with delicate art expression.
Fig. 1 is a schematic diagram illustrating an effect of a particle special effect system simulating a sandstorm effect in the related art. As shown in fig. 1, because the material algorithm has no relationship with the light source information, the rim light produced when light strikes the sandstorm cannot be seen.
Fig. 2 is a schematic diagram illustrating another effect of a particle special effect system simulating a sandstorm effect in the related art. As shown in fig. 2, because a map is used to simulate the sandstorm, there is no scene inside the sandstorm, its expression is not delicate, it cannot blend into the terrain, and only a 'false' shadow can be made.
In addition, because the dynamic effect of wind-blown sand must be considered, most of the emitted maps rotate irregularly and randomly, so the sense of volume is lost or cannot be expressed accurately.
For the above reasons, the related-art approach of emitting special-effect maps cannot provide a sandstorm that receives illumination, moves dynamically, and has a strong sense of volume at the quality required by projects with delicate art expression.
In order to solve the problems in the related art, the disclosure provides a map processing method. Fig. 3 shows a flow chart of a map processing method, which, as shown in fig. 3, comprises at least the following steps:
and S310, obtaining a first map and a fog effect material, and mixing the first map and the fog effect material to generate a flocculent fog effect.
And S320, carrying out texture movement treatment on the flocculent fog effect to obtain a fluttering fog effect, and carrying out mixed treatment on the fluttering fog effect and the first map to obtain a wiredrawing state fog effect.
And S330, carrying out illumination offset treatment on the wiredrawing state fog effect to obtain a scattering fog effect, and carrying out light and shade level adjustment on the scattering fog effect to obtain a target fog effect.
In the exemplary embodiment of the present disclosure, the mixing processing of the first map and the fog effect material and the subsequent mixing processing make full use of the characteristics of the dot matrix, so that the texture of the fog effect has more volume and layering, bringing a more realistic fog effect at a lower performance cost. Furthermore, parameter adjustment in several new directions is added in the texture movement processing, illumination offset processing, and light and shade level adjustment, giving the fog effect more expressive possibilities, enriching its application scenarios, and making the method suitable for more game art style projects.
The respective steps of the map processing method will be described in detail below.
In step S310, a first map and a fog effect material are obtained, and the first map and the fog effect material are mixed to generate a flocculent fog effect.
In an exemplary embodiment of the present disclosure, the first map and the fog effect material may be obtained first.
In an alternative embodiment, fig. 4 shows a schematic flow chart of a method for obtaining a first map and a fog effect material, as shown in fig. 4, the method at least comprises the following steps: in step S410, a single patch model is obtained, and a material map of the single patch model is obtained.
The single patch model may be a particle generated by the UE4 engine (Unreal Engine 4).
It is worth noting that the emitter only needs to emit one particle rather than many, which avoids the performance problem of too many particles when the material itself is complex.
Further, the UE4 engine may be used to obtain the material map of the single particle, i.e., the single patch model.
In step S420, a first map is obtained by synthesizing the material map, and the fog effect material is obtained.
The UE4 engine is used to synthesize a plurality of smaller material maps into one 3D (three-dimensional) map, i.e., the first map.
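As a rough illustration of this synthesis step (a minimal NumPy sketch, not the UE4 pipeline; the function name `synthesize_first_map` and the normalization are assumptions made for illustration), several small 2D material maps can be stacked into one 3D volume:

```python
import numpy as np

def synthesize_first_map(slices):
    """Stack several small 2D material maps into one 3D map.

    slices : list of equally sized 2D arrays (the material maps).
    Returns a (D, H, W) volume, the 'first map', normalized into
    [0, 1] so it can be used directly as a density field. The real
    pipeline performs this in the UE4 engine, not NumPy.
    """
    vol = np.stack(slices, axis=0).astype(np.float32)
    vmax = vol.max()
    return vol / vmax if vmax > 0 else vol
```

Each slice becomes one depth layer of the volume, which mirrors how an atlas of 2D tiles is interpreted as a 3D texture.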
Further, volume material of the UE4 engine is obtained as the fog effect material.
The volume material is the material type used by volumetric fog, and can be used to build effects on a lattice (voxels) covering the whole scene. In addition, it can give clouds and fog in the game real volume, so that a more natural scattering picture is obtained when light penetrates the fog.
In this exemplary embodiment, the corresponding first map can be acquired through a single patch model; the acquisition method is simple and accurate, and provides a map basis for the subsequent fog effect generation.
After the first map and the fog effect material are obtained, the first map and the fog effect material can be mixed.
In an alternative embodiment, fig. 5 shows a flow diagram of a method of mixing processing, as shown in fig. 5, the method at least comprising the steps of: in step S510, the first map and the fog effect material are mixed to obtain an original fog effect, and a second map is obtained.
Specifically, the spatial coordinate relationship stored in the first map can be used to express the lattice generated by the fog effect material in the world space coordinate system, thereby marking the transparent and opaque ranges of the cloud and fog in world space and obtaining the original fog effect.
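The world-space lookup described above can be sketched as follows (a hypothetical, simplified model: `density_map` stands in for the spatial data stored in the first map, and `threshold` marks the transparent/opaque boundary; neither name comes from the patent):

```python
import numpy as np

def sample_original_fog(density_map, world_pos, tile_size=1.0, threshold=0.35):
    """Sample a 3D density texture at a world-space position.

    density_map : (D, H, W) array in [0, 1], modeling the 'first map'.
    world_pos   : (x, y, z) position in world space.
    The world position is wrapped into texture coordinates (tiling),
    and densities below `threshold` are treated as fully transparent.
    """
    d, h, w = density_map.shape
    u = int(world_pos[0] / tile_size * w) % w
    v = int(world_pos[1] / tile_size * h) % h
    s = int(world_pos[2] / tile_size * d) % d
    density = density_map[s, v, u]
    return density if density >= threshold else 0.0
```

The modulo wrap is what lets one small map cover an unbounded world, which is also why the repetition problem discussed below appears.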
Further, in order to perform superposition processing on the original fog effect, a second map can be obtained. The second map is a different map from the first map.
In step S520, the second map and the original fog effect are superimposed to obtain a block fog effect, and a third map is obtained.
After the world space coordinate system is acquired, the first map may be tiled multiple times, for example 256 times, within the world space coordinate system, and the tiled first map may then be randomly offset by a constant. The second map is then scaled down by a factor of 16 (or another factor) to increase its degree of repetition. Further, the randomly offset first map and the scaled-down second map are fed into the lattice information of the fog effect material, and the material function processes them to obtain the block fog effect.
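Under the same caveat (a simplified 2D sketch; the blend operator is assumed to be a plain multiply because the patent does not name the exact operation), the tiling, random offset, and scale-down steps can be modeled as:

```python
import numpy as np

def superpose_block_fog(first_map, second_map, tiles=256, offset=0.37, scale=16):
    """Sketch of the block-fog superposition (2D slice for clarity).

    first_map, second_map : 2D density arrays in [0, 1].
    tiles  : how many times the first map repeats across the world.
    offset : constant random offset applied to the tiled first map.
    scale  : the second map is shrunk by this factor, i.e. it
             repeats `scale` times more often.
    Returns a density function over normalized world coordinates.
    """
    h1, w1 = first_map.shape
    h2, w2 = second_map.shape

    def density(x, y):
        # Tiled, constant-offset lookup into the first map.
        u1 = int((x + offset) * tiles * w1) % w1
        v1 = int((y + offset) * tiles * h1) % h1
        # The shrunken second map repeats `scale` times as often.
        u2 = int(x * tiles * scale * w2) % w2
        v2 = int(y * tiles * scale * h2) % h2
        # Blend operator assumed here: multiply the two densities.
        return first_map[v1, u1] * second_map[v2, u2]

    return density
```

Because the two maps repeat at different rates and are mutually offset, their product breaks up the visible grid pattern that a single tiled map produces.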
Although the block fog effect has already been mixed twice, the repetition problem, that is, the visible light-and-dark pattern divided along the grid, is still not solved; therefore, a third map may be further acquired, and the superposition processing may continue with the third map.
It is noted that the third map is a map with more texture details.
In step S530, the third map and the block fog effect are superimposed to obtain a flocculent fog effect.
Specifically, the spatial coordinate relationship stored in the third map may be utilized, and the dot matrix generated by the fog effect material is represented in the world space coordinate system, so as to further mark the transparent and opaque ranges of the cloud and fog in the world space, thereby obtaining the flocculent fog effect.
Moreover, since this superposition is performed in the same manner as the superposition processing in step S520, it is not described again here.
In this exemplary embodiment, the first, second, and third maps are used to perform three successive mixing operations on the fog effect material to obtain the flocculent fog effect. This improves the detail expression in terms of light and shade division, and compared with the original fog effect and the block fog effect, produces the same flocculent appearance as real fog, so the fog effect is expressed better.
In step S320, texture movement processing is performed on the flocculent fog effect to obtain a fluttering fog effect, and the fluttering fog effect and the first map are mixed to obtain a wiredrawing state fog effect.
In an exemplary embodiment of the present disclosure, after the flocculent fog effect is obtained, texture movement processing may be performed on it.
Specifically, the time attribute captured in the UE4 engine is used to move the world coordinate position captured during the previous mixing operation, so that the flocculent fog effect moves and the fluttering fog effect is obtained. The time attribute is built into the UE4 engine and is a value that increases as game time grows.
The texture movement can be controlled by defining three attributes: Timeoffset, DensityTextureA_moving, and winDRI. Timeoffset expresses the time offset control, DensityTextureA_moving defines the speed of the cloud layer's flutter, and winDRI expresses the direction of the flutter.
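A minimal sketch of this texture movement in plain Python rather than the UE4 material graph (the parameter names mirror the three attributes above, but the formula itself is an assumption about how such a pan is typically computed):

```python
def pan_world_position(world_pos, game_time, timeoffset=0.0,
                       speed=1.0, wind_dir=(1.0, 0.0, 0.0)):
    """Shift the sampled world position along the wind over time.

    timeoffset : time offset control (cf. Timeoffset).
    speed      : flutter speed (cf. DensityTextureA_moving).
    wind_dir   : flutter direction (cf. winDRI).
    Sampling the density field at the returned position makes the
    fog texture appear to drift with the wind as game time grows.
    """
    t = game_time + timeoffset
    return tuple(p - d * speed * t for p, d in zip(world_pos, wind_dir))
```

Subtracting the wind displacement from the sample position moves the texture in the wind direction, the standard trick for scrolling a static texture.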
However, cloud or dust in real life is not a single solid mass but is slightly drawn out into strands. Therefore, to achieve cloud and fog in this real state, the first map can be used to add a layer of displacement, that is, a sense of time stagger, on top of the fluttering fog effect, so as to obtain the wiredrawing state fog effect.
Specifically, the spatial coordinate relationship stored in the first map can be used to express the fluttering fog effect in the world space coordinate system in the form of the lattice generated by the fog effect material, staggered with the previous fog effect, so as to obtain the wiredrawing state fog effect.
Moreover, since the mixing processing here corresponds to performing the superposition processing again with the first map, it is the same as the superposition processing in step S520 and is not described again here.
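The time-staggered displacement that produces the wiredrawing look can be approximated as sampling the same density function twice, once with a small time lag, and blending the two samples (a sketch under that assumption, not the patent's exact material network):

```python
def wiredraw_density(sample, world_pos, game_time, stagger=0.5, mix=0.5):
    """Blend the current fog sample with a time-staggered one.

    sample(pos, t) -> density : any time-varying density function,
    e.g. the fluttering fog. The second sample lags by `stagger`
    seconds; because the fog drifts over time, the lagged layer is
    spatially mis-aligned with the current one, which stretches the
    pattern into the strand-like 'wiredrawing' look.
    """
    now = sample(world_pos, game_time)
    lagged = sample(world_pos, game_time - stagger)
    return (1.0 - mix) * now + mix * lagged
```

The same helper works whichever of the three maps drives `sample`, matching the text's note that any of the maps may be reused for this mixing step.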
In addition, the second map or the third map may be used for the mixing process.
In an alternative embodiment, the fluttering fog effect and the second map are blended to obtain the wire-drawing state fog effect.
Specifically, using the spatial coordinate relationship stored in the second map, the fluttering fog effect can be expressed in the world space coordinate system in the form of the dot matrix generated by the fog effect material, and staggered against the previous fog effect to obtain the wire-drawing state fog effect.
Moreover, since the blending here amounts to performing the second superimposition process again with the second map, it proceeds in the same way as the superimposition process in step S520 and is not described again here.
In an alternative embodiment, the fluttering fog effect and the third map are blended to obtain the wire-drawing state fog effect.
Specifically, using the spatial coordinate relationship stored in the third map, the fluttering fog effect can be expressed in the world space coordinate system in the form of the dot matrix generated by the fog effect material, and staggered against the previous fog effect to obtain the wire-drawing state fog effect.
Further, since the blending here amounts to performing the superimposition process again with the third map, it proceeds in the same way as the superimposition process in step S520 and is not described again here.
In the present exemplary embodiment, in order to obtain the wire-drawing state fog effect, the first, second, or third map may be blended with the fluttering fog effect. Because the map used for this blending has already been used earlier, no additional performance cost is incurred. In addition, any one of the three maps can be chosen according to the desired wire-drawing look, giving the blending step a variety of choices and resulting effects.
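The staggering described above can be sketched as follows: every layer captures the same original world position, but blending the 3D map with a different time per layer drifts the layers out of phase, which produces the stringy, wire-drawn look. This is a hypothetical illustration; parameter names are not engine API.

```python
def staggered_layers(base_pos, game_time, wind_dir, layer_time_offsets, speed=1.0):
    """Sample positions for several fog layers, each with its own time offset.

    Identical offsets would move all layers in lockstep (which reads as fake);
    distinct offsets stagger the layers against one another.
    """
    positions = []
    for t_off in layer_time_offsets:
        t = game_time + t_off
        positions.append([p + d * speed * t for p, d in zip(base_pos, wind_dir)])
    return positions
```

With two layers offset by half a time unit, the sampled positions diverge, so the second layer's fog lags the first and the strands pull apart.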
In step S330, illumination offset processing is performed on the wire-drawing state fog effect to obtain a scattering fog effect, and light and shade gradation adjustment is performed on the scattering fog effect to obtain a target fog effect.
In exemplary embodiments of the present disclosure, after the wire-drawing state fog effect is obtained, illumination offset processing may be applied to it.
In an alternative embodiment, fig. 6 shows a flow diagram of a method of illumination shift processing, as shown in fig. 6, the method at least comprising the steps of: in step S610, the illumination direction corresponding to the wiredrawing state fog effect is determined, and the wiredrawing state fog effect is copied to obtain a backup fog effect.
The illumination direction may be an illumination direction of captured parallel light in the engine of the UE4, or other illumination directions may be set according to actual situations, which is not particularly limited in this exemplary embodiment.
Further, the obtained wire-drawing state fog effect is copied to serve as a backup fog effect.
In step S620, offset processing is performed on the backup fog effect according to the illumination direction to obtain an offset fog effect, and brightness adjustment processing is performed on the offset fog effect to obtain a brightness fog effect, so as to determine the wire-drawing fog effect and the brightness fog effect as scattering fog effects.
In an alternative embodiment, fig. 7 shows a flow diagram of a method of brightness adjustment processing, which, as shown in fig. 7, at least comprises the following steps: in step S710, the backup fog effect is shifted according to the illumination direction to obtain a shifted fog effect, and the shifted fog effect is increased in brightness to obtain a brightness fog effect.
The copied backup fog effect is shifted by a certain amount toward the illumination direction to obtain the offset fog effect. Specifically, the offset fog effect may be moved above the wire-drawing state fog effect.
Further, the brightness of the offset fog effect is adjusted so that the brightness of the offset fog effect is increased to obtain the brightness fog effect.
In step S720, offset processing is performed on the backup fog effect in the opposite direction of the illumination direction to obtain an offset fog effect, and brightness reduction processing is performed on the offset fog effect to obtain a brightness fog effect.
The copied backup fog effect is shifted by a certain amount in the direction opposite to the illumination direction to obtain the offset fog effect. Specifically, the offset fog effect may be moved below the wire-drawing state fog effect.
Further, the brightness of the offset fogging effect is adjusted so that the brightness of the offset fogging effect is reduced to obtain the brightness fogging effect.
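The two offset copies from steps S610-S720 can be sketched together: one copy shifts toward the light and is brightened, the other shifts the opposite way and is darkened, and together with the original wire-drawing fog they approximate the rim light of a scattering cloud. Names and the linear brightness model are assumptions, not the actual material implementation.

```python
def scatter_copies(fog_pos, fog_brightness, light_dir, offset, brightness_delta):
    """Build the brightened and darkened offset copies of the backup fog.

    Returns two (position, brightness) pairs: one moved along the light
    direction with increased brightness, one moved against it with reduced
    brightness. Illustrative sketch only.
    """
    toward = ([p + d * offset for p, d in zip(fog_pos, light_dir)],
              fog_brightness + brightness_delta)
    away = ([p - d * offset for p, d in zip(fog_pos, light_dir)],
            fog_brightness - brightness_delta)
    return toward, away
```

The bright copy sits on the lit side of the fog and the dark copy on the shadowed side, which is what makes the edges read as lit from one direction.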
In the exemplary embodiment, differentiated offset processing and brightness adjustment are performed on the backup fog effect according to the illumination direction to obtain the corresponding brightness fog effects. The processing is simple and accurate; it links the display of the fog to the light source information so that it matches the real scene, produces the rim light generated where light strikes the fog, and increases the vividness and realism of the brightness fog effect.
After the brightness fog effects are obtained, they can be combined with the wire-drawing state fog effect and treated together as a scattering fog effect, presenting a scattering state.
After the scattering fog effect is obtained, the light and shade gradation of the scattering fog effect can be adjusted.
In an alternative embodiment, fig. 8 shows a flow chart of a method for adjusting a shading level, and as shown in fig. 8, the method at least includes the following steps: in step S810, the scattering fog effect and the first map are mixed to obtain a mixed fog effect, and a self-luminous map is obtained.
Specifically, the mixed fog effect can be obtained by expressing the scattering fog effect in a world space coordinate system in a lattice form generated by the fog effect material by using a spatial coordinate relationship stored in the first map.
Further, a self-luminous map can be obtained.
The self-luminous (Emissive) map controls the color and brightness of the light emitted by a surface. When a self-luminous map is used in a scene, the surface appears to emit visible light of its own.
In step S820, the light and shade gradation of the mixed fog effect and the self-luminous map is adjusted to obtain a target fog effect.
Specifically, a position below the mixed fog effect in the world space coordinate system is obtained, and the self-luminous map is used to brighten the space at that lower position; the brightening may affect some regions and leave others unchanged. That is, the transparent parts of the mixed fog effect are brightened by the self-luminous map underneath, while parts with low transparency are brightened only slightly or not at all, producing a visual brightening effect and yielding the target fog effect.
In the exemplary embodiment, the light and shade gradation of the scattering fog effect is adjusted through the first map and the self-luminous map to obtain the corresponding target fog effect, so that the target fog effect is presented with more volume feeling and gradation feeling, and the effect of cloud and fog or dust can be more truly presented. In addition, the target fog effect has more expression possibilities, and provides basis for the subsequent display of more game art styles.
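The transparency-gated brightening described above can be sketched as follows. This is a minimal model under an assumed linear transmission term; the real material's combine may differ.

```python
def emissive_brighten(fog_color, fog_alpha, emissive_color, emissive_strength):
    """Brighten fog by a self-luminous layer placed beneath it.

    Transparent regions (low alpha) let the emissive layer shine through and
    are brightened; nearly opaque regions are barely affected. Illustrative
    sketch, not engine code.
    """
    transmission = 1.0 - fog_alpha  # how much of the emissive layer shows through
    return [c + e * emissive_strength * transmission
            for c, e in zip(fog_color, emissive_color)]
```

A fully transparent sample gains the full emissive contribution, while a fully opaque sample keeps its original color, which is exactly the layered look the step aims for.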
In addition, the adjustment of the light and shade gradation of the scattering fog effect can be realized through a second map and a third map.
In an alternative embodiment, fig. 9 shows a flow chart of another method for adjusting a shading level, and as shown in fig. 9, the method at least includes the following steps: in step S910, the scattering fog effect and the second map are mixed to obtain a mixed fog effect, and a self-luminous map is obtained.
Specifically, the mixed fog effect can be obtained by expressing the scattering fog effect in a world space coordinate system in a lattice form generated by the fog effect material by using a space coordinate relationship stored in the second map.
Further, a self-luminous map can be obtained. The self-luminous map controls the color and brightness of the light emitted by a surface. When a self-luminous map is used in a scene, the surface appears to emit visible light of its own.
In step S920, the light and shade gradation of the mixed fog effect and the self-luminous map is adjusted to obtain a target fog effect.
Specifically, a position below the mixed fog effect in the world space coordinate system is obtained, and the self-luminous map is used to brighten the space at that lower position; the brightening may affect some regions and leave others unchanged. That is, the transparent parts of the mixed fog effect are brightened by the self-luminous map underneath, while parts with low transparency are brightened only slightly or not at all, producing a visual brightening effect and yielding the target fog effect.
In the exemplary embodiment, the light and shade gradation of the scattering fog effect is adjusted through the second map and the self-luminous map to obtain the corresponding target fog effect, so that the target fog effect is presented with more volume sense and gradation sense, and the effect of cloud and fog or dust can be more truly presented. In addition, the target fog effect has more expression possibilities, and provides basis for the subsequent display of more game art styles.
In an alternative embodiment, fig. 10 shows a flow chart of another method for adjusting a shading level, and as shown in fig. 10, the method at least includes the following steps: in step S1010, the scattering fog effect and the third map are mixed to obtain a mixed fog effect, and a self-luminous map is obtained.
Specifically, the mixed fog effect can be obtained by expressing the scattering fog effect in a world space coordinate system in a form of a dot matrix generated by the fog effect material by using a spatial coordinate relationship stored in the third map.
Further, a self-luminous map can be obtained. The self-luminous map controls the color and brightness of the light emitted by a surface. When a self-luminous map is used in a scene, the surface appears to emit visible light of its own.
In step S1020, the light and shade gradation of the mixed fog effect and the self-luminous map is adjusted to obtain a target fog effect.
Specifically, a position below the mixed fog effect in the world space coordinate system is obtained, and the self-luminous map is used to brighten the space at that lower position; the brightening may affect some regions and leave others unchanged. That is, the transparent parts of the mixed fog effect are brightened by the self-luminous map underneath, while parts with low transparency are brightened only slightly or not at all, producing a visual brightening effect and yielding the target fog effect.
In the exemplary embodiment, the scattering fog effect is subjected to light and shade level adjustment through the third map and the self-luminous map to obtain the corresponding target fog effect, so that the target fog effect is presented more volumetrically and hierarchically, and the cloud fog or dust effect can be more truly presented. In addition, the target fog effect has more expression possibilities, and provides basis for the subsequent display of more game art styles.
The following describes the map processing method of the embodiment of the present disclosure in detail with reference to an application scenario.
Fig. 11 is an interface diagram illustrating a method for obtaining the first map in an application scenario. As shown in fig. 11, a single patch model is obtained, and a material map of the single patch model is obtained.
The single patch model may be a particle emitted by the UE4 engine.
It is worth noting that only one particle needs to be emitted rather than many, which avoids the performance problem of emitting too many particles when the material itself is complex.
Further, the UE4 engine may be used to obtain a material map of that single particle, i.e., the single patch model.
And synthesizing the material map to obtain a first map.
The UE4 engine is used to synthesize a plurality of smaller material maps into one 3D map, i.e., the first map.
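The synthesis step can be sketched as stacking equally sized 2D slices into one 3D array and sampling it with wrap-around, which reproduces the repeating, tiled behaviour of a volume texture. A real engine packs the slices into a volume texture asset; this nested-list model is purely illustrative.

```python
def assemble_volume(slices):
    """Stack equally sized 2D grayscale material maps into one 3D map."""
    h, w = len(slices[0]), len(slices[0][0])
    assert all(len(s) == h and all(len(row) == w for row in s) for s in slices)
    return slices  # indexed as volume[z][y][x]

def sample_volume(volume, x, y, z):
    """Nearest-neighbour sample with wrap-around, matching the tiled look
    of a repeating volume texture."""
    d, h, w = len(volume), len(volume[0]), len(volume[0][0])
    return volume[z % d][y % h][x % w]
```

Each z index selects one of the original small material maps, so many small 2D maps become one addressable 3D map.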
Fig. 12 is a schematic diagram illustrating an effect of an original fog effect generated in an application scenario, and as shown in fig. 12, a volume material owned by a UE4 engine is further obtained as the fog effect material.
The volume material is the material type used by volumetric fog and can be used to create effects on the lattice (voxels) covering the whole scene. In addition, it can give clouds and fog in the game more volume, so that a more natural scattering picture is obtained when light penetrates the cloud.
After the first map and the fog effect material are obtained, the first map and the fog effect material can be mixed.
Specifically, using the spatial coordinate relationship stored in the first map, the lattice generated by the fog effect material is expressed in the world space coordinate system, marking the transparent and opaque ranges of the cloud in world space and thereby obtaining the original fog effect.
Obviously, the surface of the fog effect material by itself has no light and shade division; after blending with the first map, the original fog effect presents a much clearer light and shade division.
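The blend that produces this division can be sketched as sampling the 3D map at a world-space lattice point and thresholding the value into opaque or transparent. The threshold value and all names here are illustrative assumptions, not the actual material function.

```python
def original_fog_density(sample_3d_map, world_pos, opaque_threshold=0.5):
    """Mark a lattice point of the fog material as opaque or transparent.

    sample_3d_map is any callable returning the 3D map's density at a
    world-space position; values below the threshold are treated as fully
    transparent, which is what gives the original fog effect its light and
    shade division.
    """
    density = sample_3d_map(world_pos)
    return density if density >= opaque_threshold else 0.0
```

Dense regions of the map keep their density and render as cloud; sparse regions fall to zero and render as clear air.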
However, the repetitiveness of the original fog effect at this stage is still too high, with light and shade divided as obviously as by a grid, so a further layer of 3D map can be superimposed to eliminate the excessive repetition.
In order to perform superposition processing on the original fog effect, a second mapping can be obtained. The second map is a different map from the first map.
And the second mapping and the original fog effect are superposed to obtain the block fog effect.
Specifically, after the world space coordinate system is acquired, the first map may be tiled multiple times in it, for example 256 times, and the tiled first map may then be randomly shifted by a constant. The second map is then reduced by a factor of 16, or some other factor, to increase its degree of repetition. Further, the randomly offset first map and the reduced second map are fed into the lattice information of the fog effect material, and the material function processes them to obtain the block fog effect.
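The combination of the two maps can be sketched as follows: the first map is tiled across world space and shifted by a random constant, the second map is sampled at a higher frequency, and the two densities are multiplied so that neither map's grid-like repetition survives on its own. The multiply combine and all names are assumptions for illustration.

```python
import random

def layered_density(first_map, second_map, pos, tiles=256, shrink=16, seed=7):
    """Superimpose two 3D-map samples to break up visible repetition.

    first_map and second_map are callables returning density at a
    normalized 3D position; `tiles` tiles the first map across world space,
    `shrink` raises the second map's sampling frequency.
    """
    rng = random.Random(seed)
    offset = [rng.random() for _ in range(3)]
    p1 = tuple((c * tiles + o) % 1.0 for c, o in zip(pos, offset))
    p2 = tuple((c * shrink) % 1.0 for c in pos)
    return first_map(p1) * second_map(p2)
```

Because the two maps repeat at very different frequencies, their product rarely lines up, which is what hides the block-like repetition of the original fog effect.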
Fig. 13 is a schematic diagram illustrating the block fog effect in an application scenario. As shown in fig. 13, the block fog effect has already been blended twice, but the problem of excessive repetition has not been solved, so a third map may be obtained and used to continue the superimposition processing.
Fig. 14 is a schematic diagram illustrating the flocculent fog effect obtained by superimposing the third map in the application scene; as shown in fig. 14, the third map is a map with a more detailed texture.
And superposing the third mapping and the block fog effect to obtain the flocculent fog effect.
Specifically, the spatial coordinate relationship stored in the third map may be utilized, and the dot matrix generated by the fog effect material is represented in the world space coordinate system, so as to further mark the transparent and opaque ranges of the cloud and fog in the world space, thereby obtaining the flocculent fog effect.
Compared with the block fog effect, the flocculent fog effect shows a flocculent appearance closer to reality.
Further, the basic information of the flocculent fog effect can be controlled by more parameters in the UE4 engine.
In the application scenario, the parameters of the flocculent fog effect can be controlled by specifying its superimposition repetition degree and brightness with two parameters per layer. Each layer adds 2 more parameters, so after three blends, 6 additional parameters are available.
The repetition degree can be controlled by a tiling parameter, and the brightness by a parameter for the fineness of the texture. Further, since brightness is related to the height of the fog effect, a fog-height parameter can be added for control.
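The parameter layout described above can be sketched as a small per-layer record; the field names are illustrative, and the concrete values are placeholders rather than values from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class FogLayerParams:
    """Per-layer controls: a tiling factor governing the repetition degree
    and a texture-fineness value driving brightness. Names are assumptions."""
    tiling: float
    fineness: float

# Three blended layers at two parameters each give the six additional
# parameters, plus one shared fog-height value that also influences brightness.
layers = [FogLayerParams(256.0, 1.0), FogLayerParams(16.0, 0.8), FogLayerParams(4.0, 0.6)]
fog_height = 100.0
```

Exposing the layers this way keeps each blend independently tunable, which is what lets artists dial repetition and brightness per layer.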
Because the flocculent fog effect at this stage looks more like a static white cloud, texture moving processing can be further applied to set it in motion.
The texture moving method in the application scenario may use the time attribute captured in the UE4 engine to move the world coordinate position captured during the previous blending operation, so that the flocculent fog effect moves and the fluttering fog effect is obtained. The time attribute is built into the UE4 engine and is a value that increases as game time elapses.
The texture movement can be implemented by defining three attributes: timeoffset, DensityTextureA_moving, and winDRI. timeoffset controls the time offset, DensityTextureA_moving defines the speed at which the cloud layer drifts, and winDRI expresses the drift direction.
Of course, clouds or dust in real life are not a single solid mass; they are slightly drawn out into strands. Therefore, to approach the real state of cloud and mist, the previously used map can add a layer of displacement, that is, a sense of time stagger, on top of the fluttering fog effect to obtain the wire-drawing state fog effect.
Because cloud layers all moving at the same speed look fake, each layer of the fluttering fog effect needs to capture the original world coordinate position and then blend in a 3D map with a different time setting, so that the layers stagger against the previous fog effect into the wire-drawing state.
It should be noted that the blended 3D map may be any one of the previously used first, second, or third maps, chosen according to the desired wire-drawing state; since a previously used map is reused, no additional performance cost is incurred.
Specifically, the fluttering fog effect can be expressed in a world space coordinate system in a form of a dot matrix generated by the fog effect material by using a spatial coordinate relationship stored in the first map, and the fluttering fog effect is staggered with the previous fog effect to obtain the wiredrawing state fog effect.
Or the fluttering fog effect can be expressed in a world space coordinate system in a form of a dot matrix generated by the fog effect material by utilizing a space coordinate relation stored in the second map, and the fluttering fog effect is staggered with the previous fog effect to obtain the wiredrawing state fog effect.
And the fluttering fog effect can be expressed in a world space coordinate system in a form of a dot matrix generated by the fog effect material by utilizing a space coordinate relation stored in the third map, and the fluttering fog effect is staggered with the previous fog effect to obtain the wire-drawing fog effect.
Fig. 15 is a schematic diagram illustrating the wire-drawing state fog effect in an application scenario. As shown in fig. 15, blending the fluttering fog effect with the first map adds a layer of displacement, so that the fluttering fog effect, which previously read as "fake", now shows the wire-drawing state.
Further, volume is the quality of cloud or dust that is hardest to convey, and the problem can be addressed from the aspects of illumination expression and layering expression.
In the application scenario, the illumination direction may be determined by capturing the direction of the parallel (directional) light inside the UE4 engine.
In the application scenario, the brightness fog effect corresponding to the wire-drawing state fog effect may be obtained by first copying the wire-drawing state fog effect to serve as a backup fog effect.
And then, carrying out offset processing on the backup fog effect according to the illumination direction to obtain an offset fog effect, and carrying out brightness adjustment processing on the offset fog effect to obtain a brightness fog effect.
Firstly, offset processing is carried out on the backup fog effect according to the illumination direction to obtain an offset fog effect, and brightness increasing processing is carried out on the offset fog effect to obtain a brightness fog effect.
The copied backup fog effect is shifted by a certain amount toward the illumination direction to obtain the offset fog effect. Specifically, the offset fog effect may be moved above the wire-drawing state fog effect.
Further, the brightness of the offset fog effect is adjusted so that the brightness of the offset fog effect is increased to obtain the brightness fog effect.
And secondly, carrying out offset processing on the backup fog effect according to the opposite direction of the illumination direction to obtain an offset fog effect, and carrying out brightness reduction processing on the offset fog effect to obtain a brightness fog effect.
The copied backup fog effect is shifted by a certain amount in the direction opposite to the illumination direction to obtain the offset fog effect. Specifically, the offset fog effect may be moved below the wire-drawing state fog effect.
Further, the brightness of the offset fogging effect is adjusted so that the brightness of the offset fogging effect is reduced to obtain the brightness fogging effect.
After the brightness fog effects are obtained, they can be treated together with the wire-drawing state fog effect as a scattering fog effect.
Fig. 16 is a schematic diagram illustrating the scattering fog effect in an application scenario. As shown in fig. 16, the scattering fog effect simulates the rim light refracted by the cloud when it is lit, presenting a scattering appearance.
After the scattering fog effect is obtained, the light and shade gradation of the scattering fog effect can be adjusted.
The light and shade gradation adjustment method in the application scene can be used for mixing the scattering fog effect and the first map to obtain a mixed fog effect and acquiring a self-luminous map.
Specifically, the mixed fog effect can be obtained by expressing the scattering fog effect in a world space coordinate system in a lattice form generated by the fog effect material by using a spatial coordinate relationship stored in the first map.
Further, a self-luminous map can be obtained.
The self-luminous map controls the color and brightness of the light emitted by a surface. When a self-luminous map is used in a scene, the surface appears to emit visible light of its own.
And adjusting the light and shade levels of the mixed fog effect and the self-luminous map to obtain the target fog effect.
Specifically, a position below the mixed fog effect in the world space coordinate system is obtained, and the self-luminous map is used to brighten the space at that lower position; the brightening may affect some regions and leave others unchanged. That is, the transparent parts of the mixed fog effect are brightened by the self-luminous map underneath, while parts with low transparency are brightened only slightly or not at all, producing a visual brightening effect and yielding the target fog effect.
In addition, the adjustment of the light and shade gradation of the scattering fog effect can be realized through a second map and a third map.
And mixing the scattering fog effect and the second mapping to obtain a mixed fog effect, and obtaining a self-luminous mapping.
Specifically, the mixed fog effect can be obtained by expressing the scattering fog effect in a world space coordinate system in a lattice form generated by the fog effect material by using a space coordinate relationship stored in the second map.
Further, a self-luminous map can be obtained. The self-luminous map controls the color and brightness of the light emitted by a surface. When a self-luminous map is used in a scene, the surface appears to emit visible light of its own.
And adjusting the light and shade levels of the mixed fog effect and the self-luminous map to obtain the target fog effect.
Specifically, a position below the mixed fog effect in the world space coordinate system is obtained, and the self-luminous map is used to brighten the space at that lower position; the brightening may affect some regions and leave others unchanged. That is, the transparent parts of the mixed fog effect are brightened by the self-luminous map underneath, while parts with low transparency are brightened only slightly or not at all, producing a visual brightening effect and yielding the target fog effect.
The scattering fog effect and the third map are blended to obtain a mixed fog effect, and a self-luminous map is obtained.
Specifically, the mixed fog effect can be obtained by expressing the scattering fog effect in a world space coordinate system in a form of a dot matrix generated by the fog effect material by using a spatial coordinate relationship stored in the third map.
Further, a self-luminous map can be obtained.
The self-luminous map controls the color and brightness of the light emitted by a surface. When a self-luminous map is used in a scene, the surface appears to emit visible light of its own.
And adjusting the light and shade levels of the mixed fog effect and the self-luminous map to obtain the target fog effect.
Specifically, a position below the mixed fog effect in the world space coordinate system is obtained, and the self-luminous map is used to brighten the space at that lower position; the brightening may affect some regions and leave others unchanged. That is, the transparent parts of the mixed fog effect are brightened by the self-luminous map underneath, while parts with low transparency are brightened only slightly or not at all, producing a visual brightening effect and yielding the target fog effect.
Fig. 17 is a schematic diagram illustrating the target fog effect in an application scene. As shown in fig. 17, the pure black area is the target fog effect, and the white area is the self-luminous map shining through; the transmission of the self-luminous map gives the target fog effect a stronger sense of layering.
Fig. 18 is a schematic diagram illustrating a sandstorm effect generated in an application scenario. As shown in fig. 18, after the target fog effect is generated, parameters of the target fog effect such as sensitivity, complexity, offset, and cloud color can be finely adjusted according to actual requirements to obtain the final sandstorm appearance.
Fig. 19 compares the dust effect generated by emitting maps from a particle special-effect system with the dust effect generated by the method of the present disclosure. As shown in fig. 19, simulating a sandstorm by emitting transparent, disturbed, moving special-effect maps from a particle system is cheap, but it cannot meet project requirements for finer artistic expression. Specifically, because the material algorithm has no relationship to the light source information, the rim light produced where light strikes the dust body cannot be seen. Secondly, because the dynamic effect of wind-blown sand must be considered, most of the emitted maps rotate irregularly and randomly, so the sense of volume is lost or cannot be expressed accurately. More importantly, because maps are used to fake the sandstorm, the sandstorm has no interior scene, its expression is not refined, it cannot blend into the terrain, and only "fake" shadows can be made.
The dust effect generated by the method of the present disclosure, by contrast, fully exploits the dot-matrix characteristic used in the multiple blending passes, so the dust has more volume and layering and achieves a more realistic sandstorm. In addition, a number of adjustment parameters in new directions are added during the map processing, giving the dust more expressive possibilities and making it suitable for more game art styles. This is also a way to express a more realistic dust effect without excessive performance cost.
Fig. 20 is a schematic diagram illustrating an interface using the map processing method of the present disclosure in an application scenario. As shown in fig. 20, different 3D maps can be produced according to the requirements of different game art styles. For example, when making clouds, which are soft, the 3D map does not need very small black and white blocks; when making dust, the black and white blocks of the 3D map need to be denser.
Further, the generated 3D map may replace the default map contained in the UE4 engine and be applied in the Niagara system to emit a particle.
In the latest UE4 engine, the Niagara module has been added to the engine's plugins. When it is needed, open the Plugins window through the menu Edit > Plugins, search for Niagara, and check Enable. Right-clicking in the Content Browser then shows a new menu option, FX, which contains the various resource options of the Niagara system. Among them, Niagara System is the particle system resource, Niagara Emitter is the emitter resource, Niagara Script is the particle-system logic script resource, and Niagara Parameter Collection is the parameter set resource.
The Niagara System resource is similar to the Particle System resource in Cascade. It can be created through the right-click menu "FX > Niagara System"; when a Niagara System is dragged into a scene, the engine automatically creates a Niagara Actor and references the Niagara System resource in its Niagara Component. Double-clicking the Niagara System resource opens the editor for editing.
The Niagara system provides an editing mode similar to that of Blueprint, making particle systems more convenient and flexible to build. It allows control not only of the particle emitter but also of the individual particles, so users can customize the particles' motion and build more complex particle effects.
Fig. 21 is a schematic diagram of an interface for adjusting parameters in an application scenario. As shown in Fig. 21, after particles are emitted by the Niagara system, the parameters to be adjusted may be specified and adjusted in the parameter table of the shader (material). Finally, the shader is applied to the particle material emitted by the Niagara system to obtain the target fog effect.
When presented, an object has not only a shape but also an inherent color and a texture; the texture determines what material the object appears to be made of.
Based on this, when the effect of cloud or dust is achieved by emitting maps as special effects in the related art, the material algorithm bears no relation to the light source information, so the rim light produced when light strikes the dust cannot be seen. Secondly, because the dynamic effect of wind-blown sand must be considered, most of the emitted maps rotate irregularly and randomly, so the sense of volume is lost or cannot be expressed accurately. More importantly, because maps are used to simulate the sandstorm effect, the sandstorm has no interior scene, its expression is not refined, it cannot blend into the terrain, and only "fake" shadows can be made.
In the map processing method of the exemplary embodiments of the present disclosure, the blending of the first map with the fog effect material and the subsequent blending steps fully exploit the characteristics of the dot matrix, so that the texture of the fog effect gains more volume and layering, bringing a more realistic fog effect at a lower performance cost. In terms of sand and dust expression, the effect can blend into the terrain, making the expression more refined. Furthermore, parameter adjustment in several new directions is added during the texture movement processing, illumination offset processing, and light and shade level adjustment of the fog effect, giving the fog effect more expressive possibilities, enriching its application scenarios, and making the method suitable for more game art styles.
In addition, in an exemplary embodiment of the present disclosure, a map processing apparatus is also provided. Fig. 22 shows a schematic configuration diagram of a map processing apparatus. As shown in Fig. 22, a map processing apparatus 2200 may include: a mixing processing module 2210, a movement processing module 2220, and a fog effect adjustment module 2230. Wherein:
the mixing processing module 2210 is configured to obtain a first map and a fog effect material, and mix the first map and the fog effect material to generate a flocculent fog effect; the movement processing module 2220 is configured to perform texture movement processing on the flocculent fog effect to obtain a fluttering fog effect, and mix the fluttering fog effect with the first map to obtain a wiredrawing state fog effect; and the fog effect adjusting module 2230 is configured to perform illumination offset processing on the wiredrawing state fog effect to obtain a scattering fog effect, and perform light and shade level adjustment on the scattering fog effect to obtain a target fog effect.
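The pipeline implemented by the three modules can be sketched end to end. The following is a minimal illustrative sketch, assuming multiply blends, a scrolling texture offset in place of the shader's UV animation, and arbitrary offset and level constants; none of these values come from the disclosure itself.

```python
import numpy as np

def blend(a, b):
    """Multiply blend, one common way to combine a noise map with a material."""
    return np.clip(a * b, 0.0, 1.0)

def texture_move(tex, offset):
    """Scroll the texture by an integer texel offset to animate it."""
    return np.roll(tex, shift=offset, axis=(0, 1))

def process_map(first_map, fog_material, t=1):
    flocculent = blend(first_map, fog_material)         # flocculent fog effect
    fluttering = texture_move(flocculent, (t, t))       # fluttering fog effect
    wiredrawing = blend(fluttering, first_map)          # wiredrawing state fog effect
    # illumination offset: duplicate and shift along the light direction,
    # then add the weighted copy back onto the original
    backup = texture_move(wiredrawing, (0, 2))
    scattering = np.clip(wiredrawing + 0.3 * backup, 0.0, 1.0)  # scattering fog effect
    return np.clip(scattering * 1.2 - 0.1, 0.0, 1.0)    # light and shade level adjustment
```

The maps here are simply 2D arrays of values in [0, 1]; in the engine each step would instead be a node in the material graph driving the Niagara particle material.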
In an exemplary embodiment of the invention, the obtaining of the first map and the fog effect material includes:
obtaining a single facet model and obtaining a material map of the single facet model;
and synthesizing the material map to obtain a first map, and obtaining the fog effect material.
In an exemplary embodiment of the present invention, the mixing of the first map and the fog effect material to generate a flocculent fog effect includes:
mixing the first map and the fog effect material to obtain an original fog effect, and obtaining a second map;
superposing the second map and the original fog effect to obtain a block fog effect, and acquiring a third map;
and superposing the third map and the block-shaped fog effect to obtain a flocculent fog effect.
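The three-step blend chain above can be sketched as follows. This is a minimal sketch that assumes a multiply blend for the first step and an overlay blend for the two superpositions; the disclosure does not fix the blend modes, so both choices are illustrative assumptions.

```python
import numpy as np

def overlay(base, top):
    """Overlay blend: darkens dark regions and brightens bright ones."""
    return np.where(base < 0.5,
                    2.0 * base * top,
                    1.0 - 2.0 * (1.0 - base) * (1.0 - top))

def flocculent_fog(first_map, fog_material, second_map, third_map):
    original = np.clip(first_map * fog_material, 0.0, 1.0)  # original fog effect
    blocky = overlay(original, second_map)                  # block-shaped fog effect
    return overlay(blocky, third_map)                       # flocculent fog effect
```

Note that an overlay with a mid-gray (0.5) top layer leaves the base unchanged, which is why noise maps for this kind of superposition are often authored around mid-gray.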
In an exemplary embodiment of the invention, the method further comprises:
and mixing the fluttering fog effect and the second map to obtain the wiredrawing state fog effect.
In an exemplary embodiment of the present invention, the performing of light and shade level adjustment on the scattering fog effect to obtain a target fog effect includes:
mixing the scattering fog effect and the second map to obtain a mixed fog effect, and obtaining a self-luminous map;
and adjusting the light and shade levels of the mixed fog effect and the self-luminous map to obtain a target fog effect.
In an exemplary embodiment of the invention, the method further comprises:
and mixing the fluttering fog effect and the third map to obtain the wiredrawing state fog effect.
In an exemplary embodiment of the present invention, the performing of light and shade level adjustment on the scattering fog effect to obtain a target fog effect includes:
mixing the scattering fog effect and the third map to obtain a mixed fog effect, and obtaining a self-luminous map;
and adjusting the light and shade levels of the mixed fog effect and the self-luminous map to obtain a target fog effect.
In an exemplary embodiment of the present invention, the performing of illumination offset processing on the wiredrawing state fog effect to obtain the scattering fog effect includes:
determining an illumination direction corresponding to the wiredrawing state fog effect, and copying the wiredrawing state fog effect to obtain a backup fog effect;
and performing offset processing on the backup fog effect according to the illumination direction to obtain an offset fog effect, and performing brightness adjustment processing on the offset fog effect to obtain a brightness fog effect, so as to determine the wiredrawing state fog effect and the brightness fog effect as the scattering fog effect.
In an exemplary embodiment of the present invention, the performing of offset processing on the backup fog effect according to the illumination direction to obtain an offset fog effect, and performing brightness adjustment processing on the offset fog effect to obtain a brightness fog effect, includes:
performing offset processing on the backup fog effect according to the illumination direction to obtain an offset fog effect, and performing brightness increasing processing on the offset fog effect to obtain a brightness fog effect; or
performing offset processing on the backup fog effect according to the direction opposite to the illumination direction to obtain an offset fog effect, and performing brightness reducing processing on the offset fog effect to obtain a brightness fog effect.
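The two branches above can be sketched together. This is a hypothetical sketch in which the illumination direction is given in texel units, np.roll stands in for the UV offset a shader would apply, and the fixed brightness delta is an assumed parameter.

```python
import numpy as np

def illumination_offset(wiredrawing, light_dir, delta=0.2, toward_light=True):
    """Copy the fog, shift the copy along (or against) the light direction,
    and raise (or lower) its brightness; the original plus the shifted copy
    together form the scattering fog effect."""
    dy, dx = light_dir
    if toward_light:
        # branch 1: offset along the light direction, then brighten
        backup = np.roll(wiredrawing, shift=(dy, dx), axis=(0, 1))
        brightness_fog = np.clip(backup + delta, 0.0, 1.0)
    else:
        # branch 2: offset against the light direction, then darken
        backup = np.roll(wiredrawing, shift=(-dy, -dx), axis=(0, 1))
        brightness_fog = np.clip(backup - delta, 0.0, 1.0)
    return wiredrawing, brightness_fog  # both layers make up the scattering fog
```

The brightened copy approximates light scattered on the lit side of the fog, while the darkened branch approximates self-shadowing on the far side.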
In an exemplary embodiment of the present invention, the performing of light and shade level adjustment on the scattering fog effect to obtain a target fog effect includes:
mixing the scattering fog effect and the first map to obtain a mixed fog effect, and obtaining a self-luminous map;
and adjusting the light and shade levels of the mixed fog effect and the self-luminous map to obtain a target fog effect.
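The mixing and level adjustment above can be sketched as follows. Mixing by multiplication, additive self-luminosity, and the black/white/gamma constants are illustrative assumptions, not values taken from the disclosure.

```python
import numpy as np

def levels(x, black=0.1, white=0.9, gamma=1.2):
    """Simple light and shade level adjustment: remap [black, white]
    to [0, 1], then apply a gamma curve."""
    y = np.clip((x - black) / (white - black), 0.0, 1.0)
    return y ** gamma

def target_fog(scattering_fog, first_map, emissive_map):
    # mix the scattering fog with the first map, then add the self-luminous map
    mixed = np.clip(scattering_fog * first_map + emissive_map, 0.0, 1.0)
    return levels(mixed)  # target fog effect
```

Raising the black point crushes faint haze into transparency, while the gamma term controls how quickly the mid-tones of the fog roll off toward shadow.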
The details of the map processing apparatus 2200 described above have been described in detail in the corresponding map processing method, and therefore are not described herein again.
It should be noted that although several modules or units of the map processing apparatus 2200 are mentioned in the above detailed description, such division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
An electronic device 2300 according to such an embodiment of the invention is described below with reference to fig. 23. The electronic device 2300 shown in fig. 23 is only an example and should not bring any limitation to the function and the scope of use of the embodiment of the present invention.
As shown in fig. 23, electronic device 2300 is embodied in the form of a general purpose computing device. Components of electronic device 2300 may include, but are not limited to: the at least one processing unit 2310, the at least one memory unit 2320, a bus 2330 connecting the various system components (including the memory unit 2320 and the processing unit 2310), and a display unit 2340.
Wherein the memory unit stores program code, which is executable by the processing unit 2310, to cause the processing unit 2310 to perform steps according to various exemplary embodiments of the present invention as described in the "exemplary methods" section above of this specification.
The storage unit 2320 may include readable media in the form of volatile storage units, such as a random access storage unit (RAM)2321 and/or a cache storage unit 2322, and may further include a read-only storage unit (ROM) 2323.
Storage unit 2320 may also include a program/utility 2324 having a set (at least one) of program modules 2325, such program modules 2325 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 2330 can be any one or more of several types of bus structures including a memory cell bus or memory cell controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 2300 may also communicate with one or more external devices 2500 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 2300, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 2300 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interface 2350. Also, the electronic device 2300 can communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet) via the network adapter 2360. As shown, network adapter 2360 communicates with the other modules of electronic device 2300 over bus 2330. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with electronic device 2300, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above-mentioned "exemplary methods" section of the present description, when said program product is run on the terminal device.
Referring to fig. 24, a program product 2400 for implementing the above-described method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (13)

1. A method for processing a map, the method comprising:
acquiring a first map and a fog effect material, and mixing the first map and the fog effect material to generate a flocculent fog effect;
performing texture movement processing on the flocculent fog effect to obtain a fluttering fog effect, and mixing the fluttering fog effect with the first map to obtain a wiredrawing state fog effect;
and performing illumination offset processing on the wiredrawing state fog effect to obtain a scattering fog effect, and performing light and shade level adjustment on the scattering fog effect to obtain a target fog effect.
2. The map processing method according to claim 1, wherein the obtaining of the first map and the fog effect material comprises:
obtaining a single facet model and obtaining a material map of the single facet model;
and synthesizing the material map to obtain a first map, and obtaining the fog effect material.
3. The map processing method according to claim 1, wherein the mixing of the first map and the fog effect material to generate a flocculent fog effect comprises:
mixing the first map and the fog effect material to obtain an original fog effect, and obtaining a second map;
superposing the second map and the original fog effect to obtain a block fog effect, and acquiring a third map;
and superposing the third map and the block-shaped fog effect to obtain a flocculent fog effect.
4. The map processing method according to claim 3, wherein the method further comprises:
and mixing the fluttering fog effect and the second map to obtain the wiredrawing state fog effect.
5. The map processing method according to claim 4, wherein the performing of light and shade level adjustment on the scattering fog effect to obtain the target fog effect comprises:
mixing the scattering fog effect and the second map to obtain a mixed fog effect, and obtaining a self-luminous map;
and adjusting the light and shade levels of the mixed fog effect and the self-luminous map to obtain a target fog effect.
6. The map processing method according to claim 3, wherein the method further comprises:
and mixing the fluttering fog effect and the third map to obtain the wiredrawing state fog effect.
7. The map processing method according to claim 6, wherein the performing of light and shade level adjustment on the scattering fog effect to obtain the target fog effect comprises:
mixing the scattering fog effect and the third map to obtain a mixed fog effect, and obtaining a self-luminous map;
and adjusting the light and shade levels of the mixed fog effect and the self-luminous map to obtain a target fog effect.
8. The map processing method according to claim 1, wherein the performing of illumination offset processing on the wiredrawing state fog effect to obtain the scattering fog effect comprises:
determining an illumination direction corresponding to the wiredrawing state fog effect, and copying the wiredrawing state fog effect to obtain a backup fog effect;
and performing offset processing on the backup fog effect according to the illumination direction to obtain an offset fog effect, and performing brightness adjustment processing on the offset fog effect to obtain a brightness fog effect, so as to determine the wiredrawing state fog effect and the brightness fog effect as the scattering fog effect.
9. The map processing method according to claim 8, wherein the performing of offset processing on the backup fog effect according to the illumination direction to obtain an offset fog effect, and performing brightness adjustment processing on the offset fog effect to obtain a brightness fog effect, comprises:
performing offset processing on the backup fog effect according to the illumination direction to obtain an offset fog effect, and performing brightness increasing processing on the offset fog effect to obtain a brightness fog effect; or
performing offset processing on the backup fog effect according to the direction opposite to the illumination direction to obtain an offset fog effect, and performing brightness reducing processing on the offset fog effect to obtain a brightness fog effect.
10. The map processing method according to claim 1, wherein the performing of light and shade level adjustment on the scattering fog effect to obtain a target fog effect comprises:
mixing the scattering fog effect and the first map to obtain a mixed fog effect, and obtaining a self-luminous map;
and adjusting the light and shade levels of the mixed fog effect and the self-luminous map to obtain a target fog effect.
11. A map processing apparatus, characterized by comprising:
the mixing processing module is configured to obtain a first map and a fog effect material, and mix the first map and the fog effect material to generate a flocculent fog effect;
the movement processing module is configured to perform texture movement processing on the flocculent fog effect to obtain a fluttering fog effect, and perform mixing processing on the fluttering fog effect and the first map to obtain a wiredrawing state fog effect;
and the fog effect adjusting module is configured to perform illumination offset processing on the wiredrawing state fog effect to obtain a scattering fog effect, and perform light and shade level adjustment on the scattering fog effect to obtain a target fog effect.
12. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the map processing method according to any one of claims 1 to 10.
13. An electronic device, comprising:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the map processing method of any of claims 1-10 via execution of the executable instructions.
Publication: CN114022605A (en), published 2022-02-08



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination