CN112354187A - Fog dispersal system based on GPU and fog dispersal generation method - Google Patents

Fog dispersal system based on GPU and fog dispersal generation method

Info

Publication number: CN112354187A
Application number: CN202011261399.0A
Authority: CN (China)
Prior art keywords: fog, map, unlocking, module, rendering
Legal status: Withdrawn (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 王屹, 张一帆, 郑宇华
Current assignee: Zhuhai Jianxin Interactive Entertainment Co ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Zhuhai Jianxin Interactive Entertainment Co ltd
Priority date: 2020-11-12 (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Filing date: 2020-11-12
Publication date: 2021-02-12
Application filed by Zhuhai Jianxin Interactive Entertainment Co ltd

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 - General purpose image data processing
    • G06T1/20 - Processor architectures; Processor configuration, e.g. pipelining

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a GPU-based fog-of-war system and a fog-of-war generation method. The system comprises: a map acquisition module, which obtains a fog-of-war block identifier from map coordinates and looks up, in a resource configuration table, the unlocking condition and the fog-of-war texture path corresponding to that identifier; a texture processing module, which determines the fog-of-war unlocking state of the block identifier from the unlocking condition, reads the fog-of-war texture path according to that state, passes the texture to a shader, and has the shader process the fog-of-war texture into rendered texture data; and a texture rendering module, which renders the rendered texture data to produce the fog-of-war scene effect. Because the unlocking conditions and the corresponding texture paths are defined in a configuration table and the computation is performed by a shader on the GPU, the fog-of-war display efficiency is greatly improved and frame stalls are effectively avoided.

Description

Fog dispersal system based on GPU and fog dispersal generation method
Technical Field
The invention relates to the technical field of games, and in particular to a GPU-based fog-of-war system and a fog-of-war generation method.
Background
Many games currently on the market implement a "fog of war": areas that the player has not yet explored are shown covered by a layer of grey mist, and are unlocked gradually as the player explores them.
In the traditional implementation of fog of war, a texture map represents the state of the terrain. When a region of terrain is unlocked, the CPU computes the corresponding pixels of the texture and modifies their values, and the occluded value and the opened value of each pixel are rendered as the corresponding effect: either the scene is shown or the fog-of-war effect is shown. However, the texture generated by the CPU must then be uploaded to the GPU, and this happens on the main thread, so the approach is inefficient and can even cause the frame to stall.
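For concreteness, a minimal Unity C# sketch of this traditional CPU-side approach (Unity and C# are named later in the description); the field names, block size and method names here are illustrative assumptions, not taken from the patent:

    using UnityEngine;

    // Traditional CPU approach: rewrite fog pixels on the CPU, then re-upload.
    public class CpuFogOfWar : MonoBehaviour
    {
        public Texture2D fogTexture;   // grayscale mask, 1 = fogged, 0 = clear (assumed)
        public int blockSize = 32;     // assumed pixel size of one fog block

        // Clear one fog block; Apply() re-uploads the whole texture on the main
        // thread, which is the stall the paragraph above attributes to this method.
        public void UnlockBlock(int blockX, int blockY)
        {
            for (int y = 0; y < blockSize; y++)
                for (int x = 0; x < blockSize; x++)
                    fogTexture.SetPixel(blockX * blockSize + x,
                                        blockY * blockSize + y,
                                        Color.clear);
            fogTexture.Apply();        // synchronous CPU-to-GPU upload
        }
    }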
Disclosure of Invention
The present invention aims to solve at least one of the problems of the prior art. It therefore provides a GPU-based fog-of-war system that can improve rendering efficiency.
The invention further provides a fog-of-war generation method corresponding to the GPU-based fog-of-war system.
According to a first aspect of the invention, a GPU-based fog-of-war system comprises: a map acquisition module for obtaining a fog-of-war block identifier from map coordinates, and for obtaining, from a resource configuration table, the unlocking condition and the fog-of-war texture path corresponding to the block identifier; a texture processing module for determining the fog-of-war unlocking state corresponding to the block identifier from the unlocking condition, reading the fog-of-war texture path according to that state, passing the texture to a shader, and processing the fog-of-war texture with the shader to obtain rendered texture data; and a texture rendering module for rendering the rendered texture data to obtain the fog-of-war scene effect.
The GPU-based fog-of-war system has the following beneficial effects: the unlocking conditions and the corresponding texture paths are defined in a configuration table, and the computation is performed by a shader on the GPU, so the fog-of-war display efficiency is greatly improved and frame stalls are effectively avoided.
According to some embodiments of the invention, the texture processing module comprises: an unlock judgment module for judging, from the unlocking condition, whether the game character has met the condition and unlocked the block, to obtain the fog-of-war unlocking state; a fog-of-war display module for reading the fog-of-war texture path, passing the texture to the shader, and processing the fog-of-war texture with the shader to obtain the rendered texture data; a fog dissipation module for reducing the fog density of the fog of war from high to low until it reaches zero; and a scene display module for obtaining the rendered texture data from the scene map.
According to some embodiments of the invention, the system further comprises: a hardware detection module for detecting the hardware configuration of the user's mobile device and determining the hardware tier of the device from that configuration; and a display optimization module for reading the fog-of-war function configuration table according to the hardware tier, setting the fog-of-war canvas texture size and block texture size, and enabling or disabling individual fog-of-war effects through shader keywords.
According to some embodiments of the invention, the system further comprises a fog-of-war design module for providing an interactive interface that displays the scene map at a given scale, allows the user to mark fog-of-war regions in the scene map and assign fog-of-war textures to them, and generates the fog-of-war information of the scene map from the user's operations, the fog-of-war information comprising the fog-of-war block identifiers, the fog-of-war regions and the fog-of-war texture paths.
According to some embodiments of the invention, the system further comprises a resource configuration module for assigning to each fog-of-war block identifier the corresponding unlocking condition, fog-of-war texture path, and the NPCs and items spawned after unlocking, and storing them in the resource configuration table.
According to some embodiments of the invention, the system further comprises: an edge blur module for applying a Gaussian blur to the edges of the fog-of-war texture to generate a first texture; and an edge tuning module for providing an interactive interface in which the first texture is tuned to generate a second texture.
According to some embodiments of the invention, the system further comprises: a height fog module for controlling the darkness of the fog color according to the distance from the ground; and a flowing fog effect module for adding a UV scrolling effect on top of the rendered fog-of-war texture.
According to a second aspect of the invention, a fog-of-war generation method comprises the following steps: S100, obtaining a fog-of-war block identifier from map coordinates, and obtaining, from a resource configuration table, the unlocking condition and fog-of-war texture path corresponding to the block identifier; S200, determining the fog-of-war unlocking state corresponding to the block identifier from the unlocking condition, reading the fog-of-war texture path according to that state, passing the texture to a shader, and processing the fog-of-war texture with the shader to obtain rendered texture data; and S300, rendering the rendered texture data on the GPU to obtain the fog-of-war scene effect.
The fog-of-war generation method has the following beneficial effects: the unlocking conditions and the corresponding texture paths are defined in a configuration table, and the computation is performed by a shader on the GPU, so the fog-of-war display efficiency is greatly improved and frame stalls are effectively avoided.
According to some embodiments of the invention, step S200 comprises: S210, judging, from the unlocking condition, whether the game character has met the condition and unlocked the block, to obtain the fog-of-war unlocking state; S220, if the fog-of-war unlocking state is not unlocked, reading the fog-of-war texture path, passing the texture to a shader, and processing the fog-of-war texture with the shader to obtain the rendered texture data; and S230, if the fog-of-war unlocking state is first-time unlocking, having the shader reduce the fog density from high to low over time until it reaches zero, and displaying the scene map.
According to some embodiments of the invention, the method further comprises: detecting the hardware configuration of the user's mobile device and determining the hardware tier of the device from that configuration; and reading the fog-of-war function configuration table according to the hardware tier, setting the fog-of-war canvas texture size and block texture size, and enabling or disabling individual fog-of-war effects through shader keywords.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic block diagram of the modules of a system according to an embodiment of the invention;
FIG. 2 is a detailed block diagram of a system according to an embodiment of the present invention;
FIG. 3 is a flow chart illustrating a method according to an embodiment of the present invention.
Reference numerals:
map acquisition module 100, texture processing module 200, texture rendering module 300.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, "several" means one or more and "a plurality of" means two or more; "greater than", "less than", "exceeding" and the like are understood as excluding the stated number, while "above", "below", "within" and the like are understood as including the stated number. If "first" and "second" are used, they serve only to distinguish technical features and are not to be understood as indicating or implying relative importance, implicitly indicating the number of technical features indicated, or implicitly indicating the precedence of the technical features indicated.
Referring to FIG. 1, a system according to an embodiment of the present invention comprises: a map acquisition module 100, configured to obtain a fog-of-war block identifier from map coordinates and to obtain, from a resource configuration table, the unlocking condition and fog-of-war texture path corresponding to the block identifier; a texture processing module 200, configured to determine the fog-of-war unlocking state corresponding to the block identifier from the unlocking condition, read the fog-of-war texture path according to that state, pass the texture to a shader, and process the fog-of-war texture with the shader to obtain rendered texture data; and a texture rendering module 300, configured to render the rendered texture data to obtain the fog-of-war scene effect.
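As a minimal sketch of how map coordinates might be turned into a block identifier, assuming the scene map is divided into a regular grid; the grid dimensions and class name are hypothetical, since the patent does not specify how the identifier is derived:

    using UnityEngine;

    // Assumed mapping from map coordinates to a fog-of-war block identifier:
    // the scene map is divided into a regular grid and the cell index is the ID.
    public static class FogBlockLookup
    {
        public const int BlocksPerRow = 16;        // assumed grid width in blocks
        public const float BlockWorldSize = 64f;   // assumed world units per block

        public static int GetBlockId(Vector2 mapCoordinate)
        {
            int bx = Mathf.FloorToInt(mapCoordinate.x / BlockWorldSize);
            int by = Mathf.FloorToInt(mapCoordinate.y / BlockWorldSize);
            return by * BlocksPerRow + bx;         // row-major block identifier
        }
    }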
In the system of the embodiment of the present invention, referring to FIG. 2, the system further comprises a fog-of-war design module, which provides an interactive interface that displays the scene map at a given scale, allows the user to mark fog-of-war regions in the scene map and assign fog-of-war textures to them, and generates the fog-of-war information of the scene map from the user's operations. In the fog-of-war design module the user can set the position, shape and size of a fog-of-war region and the path of the texture it should use; the module automatically generates the fog-of-war block identifier, derives the region information (position, shape, size, etc.) within the scene map from the key-point coordinates of the region drawn on the map, and stores the region information and texture path indexed by the block identifier. The block identifier and texture path may be stored in the resource configuration table of the fog of war, and the region information may be stored in a fog-of-war file associated with the scene map, for example an exported configuration file in lua format that records the number of fog-of-war blocks in the map, the texture resource of each block and its current unlocking state. The unlocking states of the fog-of-war regions may be stored on the server, and the local configuration file is updated by requesting the fog-of-war unlocking state data from the server. The fog-of-war design module is generally used during game scene map design. The system of the embodiment of the invention further comprises a resource configuration module, which assigns to each fog-of-war block identifier the corresponding unlocking condition, fog-of-war texture path, and the NPCs and items spawned after unlocking, and stores them in the resource configuration table. In one embodiment of the invention, the resource configuration table contains the following fields.
Id: the fog-of-war block identifier; a unique, non-repeating value.
Desc: a description of the fog-of-war block; may be left empty.
Path: the resource path of the texture required by the fog-of-war block.
Monster_id: the list of monsters placed in the fog-of-war block, related to tasks; if left empty, no monsters are placed.
State: the initial state of the fog of war; 0 means not opened, 1 means locked until the unlocking condition is fulfilled, and 2 means unlocked by default.
Condition: the unlocking condition of the fog of war, an index into an unlocking-condition table that describes the specific requirement of each condition.
Chest: the id of the treasure chest placed in the fog-of-war block; the chest can be opened by clicking, and if a reward is configured for it the reward is granted.
Position: the location of the treasure chest referenced by the Chest field.
Other: the table can be extended with further fields as required by the gameplay. An illustrative sketch of one such entry is given after this list.
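A minimal sketch of what one entry of this resource configuration table could look like, written as a C# type for consistency with the Unity embodiment described later; the field types and the example values are assumptions for illustration only:

    // Hypothetical shape of one resource configuration table entry (field names
    // follow the description above; types and example values are assumed).
    public class FogBlockConfig
    {
        public int Id;               // unique fog-of-war block identifier
        public string Desc;          // optional description
        public string Path;          // resource path of the fog-of-war texture
        public int[] Monster_id;     // monsters spawned in the block (empty = none)
        public int State;            // 0 = not opened, 1 = locked until condition met, 2 = unlocked by default
        public int Condition;        // index into the unlocking-condition table
        public int Chest;            // treasure chest id placed in the block (0 = none)
        public UnityEngine.Vector2 Position; // location of the chest
    }

    // Example entry (illustrative values only):
    // new FogBlockConfig { Id = 1001, Desc = "north forest", Path = "Fog/Block_1001",
    //                      Monster_id = new[] { 2001 }, State = 1, Condition = 12,
    //                      Chest = 301, Position = new UnityEngine.Vector2(120f, 48f) };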
In the embodiment of the present invention, referring to FIG. 2, the hardware detection module further detects the hardware configuration of the user's mobile device and determines the hardware tier of the device from that configuration; the display optimization module then reads the fog-of-war function configuration table according to the hardware tier, sets the fog-of-war canvas texture size and block texture size, and enables or disables individual fog-of-war effects through shader keywords. The map acquisition module 100 reads the fog-of-war texture path from the resource configuration table and passes it to the texture processing module 200. The texture processing module 200 comprises: an unlock judgment module for judging, from the unlocking condition, whether the game character has met the condition and unlocked the block, to obtain the fog-of-war unlocking state; a fog-of-war display module for reading the fog-of-war texture path, passing the texture to the shader, and processing the fog-of-war texture with the shader to obtain rendered texture data; a fog dissipation module for reducing the fog density from high to low over a preset time until it reaches zero; and a scene display module for obtaining rendered texture data from the scene map. If the fog-of-war unlocking state is unlocked by default, or the current game character has already unlocked the fog of war, the scene is displayed directly by the scene display module. If the current game character has not yet unlocked the block but the unlocking condition has just been met, this is a first-time unlock: the fog dissipation module reduces the fog density from its current value to zero so that the fog is seen to dissipate gradually, the current scene is displayed, and the NPCs and items that appear after unlocking are spawned according to the gameplay settings in the resource configuration table. If the current game character has not unlocked the block and the unlocking condition is not met, the fog of war of the region is rendered by the fog-of-war display module.
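A minimal sketch of the unlock-state branching described above, reusing the FogBlockConfig type sketched earlier; the state codes follow the configuration table (0/1/2), while the class and helper method names are hypothetical:

    using UnityEngine;

    // Hypothetical dispatcher for one fog-of-war block; method names are assumptions.
    public class FogBlockController : MonoBehaviour
    {
        public FogBlockConfig config;      // entry from the resource configuration table
        public bool playerHasUnlocked;     // per-character unlock flag (e.g. from the server)

        public void Refresh(bool conditionMet)
        {
            if (config.State == 2 || playerHasUnlocked)
            {
                ShowScene();                   // unlocked by default or already unlocked
            }
            else if (conditionMet)
            {
                playerHasUnlocked = true;      // first-time unlock
                StartCoroutine(FadeFogOut());  // fog density high -> 0 over a preset time
                SpawnUnlockRewards();          // NPCs/items from the configuration table
            }
            else
            {
                RenderFog();                   // still locked: draw the fog-of-war texture
            }
        }

        void ShowScene() { /* display the scene map for this block */ }
        System.Collections.IEnumerator FadeFogOut() { yield break; /* gradual dissipation */ }
        void SpawnUnlockRewards() { /* spawn NPCs and items configured for this block */ }
        void RenderFog() { /* pass the fog texture to the shader for rendering */ }
    }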
In a Unity engine embodiment, the fog of war is displayed as follows:
1. In the application layer, load the specified block texture using the C# scripting language built into the Unity engine and convert it into a texture object recognized by the engine.
2. In the application layer, create a canvas, called a 'RenderTexture' in the Unity engine; this is the target render canvas.
3. Create a material and assign the Shader that will process it; this Shader implements the fog-of-war computation. Set the input parameters the Shader requires, such as the texture loaded in step 1 and further parameters such as the macro definitions (shader keywords) that control the fog-of-war rendering effect.
4. Using an interface provided by the engine, run the shader of the specified material over the texture to compute the texture data, and copy the result to the specified target texture or frame buffer.
5. The engine takes the data computed in step 4 from the frame buffer and renders it.
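A minimal Unity C# sketch of steps 1 to 5, assuming a hypothetical shader named "Custom/FogOfWar" and hypothetical property, keyword and resource names; it illustrates the flow rather than the patent's actual shader:

    using UnityEngine;

    // Sketch of steps 1-5: load the block texture, create the canvas and material,
    // and let the GPU compute the fog texture via Graphics.Blit.
    public class FogRenderer : MonoBehaviour
    {
        public RenderTexture fogCanvas;        // step 2: target render canvas
        Material fogMaterial;                  // step 3: material driving the fog shader

        void Start()
        {
            // Step 1: load the specified block texture (path is an assumed example).
            Texture2D blockTexture = Resources.Load<Texture2D>("Fog/Block_1001");

            // Step 2: create the canvas texture (size per the function configuration table).
            fogCanvas = new RenderTexture(1024, 1024, 0, RenderTextureFormat.RGFloat);

            // Step 3: create the material and set shader inputs ("Custom/FogOfWar" is hypothetical).
            fogMaterial = new Material(Shader.Find("Custom/FogOfWar"));
            fogMaterial.SetTexture("_FogBlockTex", blockTexture);
            fogMaterial.EnableKeyword("FOG_HEIGHT");   // example keyword controlling an effect

            // Step 4: run the shader on the GPU and write the result into the canvas.
            Graphics.Blit(blockTexture, fogCanvas, fogMaterial);

            // Step 5: the engine samples fogCanvas when rendering the scene.
        }
    }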
The system in the embodiment of the invention further comprises modules for tuning the fog-of-war effect, for example: an edge blur module for applying a Gaussian blur to the edges of the fog-of-war texture to generate a first texture; and an edge tuning module for providing an interactive interface in which the first texture is tuned to generate a second texture. The edge blur module processes the edges of the fog-of-war texture automatically, giving a preliminary blurred edge and reducing the workload of the artists; the artists then fine-tune the first texture in the edge tuning module so that the fog-of-war edges look more natural. The system further comprises: a height fog module for controlling the darkness of the fog color according to the distance from the ground; and a flowing fog effect module for adding a UV scrolling effect on top of the rendered fog-of-war texture. The height fog module and the flowing fog effect module increase the expressiveness of the fog so that it looks more vivid and natural; at the same time, on devices whose hardware tier is too low, the corresponding effects can be switched off automatically according to the hardware tier of the user's device, giving priority to a smooth frame rate.
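As one possible way to drive the UV scrolling ("flowing fog") effect from C#, a minimal sketch assuming the fog material exposes an overlay texture property named "_FlowTex"; the property name and speed are illustrative, and in practice the scrolling could equally be done inside the shader itself:

    using UnityEngine;

    // Scroll the UV offset of an overlay texture each frame to suggest drifting fog.
    public class FlowingFogEffect : MonoBehaviour
    {
        public Material fogMaterial;                           // material produced by the fog shader
        public Vector2 flowSpeed = new Vector2(0.05f, 0.02f);  // assumed scroll speed in UV per second

        void Update()
        {
            // "_FlowTex" is an assumed property name for the flow overlay texture.
            fogMaterial.SetTextureOffset("_FlowTex", Time.time * flowSpeed);
        }
    }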
The method of the embodiment of the invention, referring to FIG. 3, comprises the following steps: S100, obtaining a fog-of-war block identifier from map coordinates, and obtaining, from a resource configuration table, the unlocking condition and fog-of-war texture path corresponding to the block identifier; S200, determining the fog-of-war unlocking state corresponding to the block identifier from the unlocking condition, reading the fog-of-war texture path according to that state, passing the texture to a shader, and processing the fog-of-war texture with the shader to obtain rendered texture data; and S300, rendering the rendered texture data on the GPU to obtain the fog-of-war scene effect.
Step S200 of the method of an embodiment of the invention comprises: S210, judging, from the unlocking condition, whether the game character has met the condition and unlocked the block, to obtain the fog-of-war unlocking state; S220, if the fog-of-war unlocking state is not unlocked, reading the fog-of-war texture path, passing the texture to a shader, and processing the fog-of-war texture with the shader to obtain rendered texture data; and S230, if the fog-of-war unlocking state is first-time unlocking, having the shader reduce the fog density from high to low over time until it reaches zero, and displaying the scene map.
The method of the embodiment of the invention further comprises: detecting the hardware configuration of the user's mobile device and determining the hardware tier of the device from that configuration; and reading the fog-of-war function configuration table according to the hardware tier, setting the fog-of-war canvas texture size and block texture size, and enabling or disabling individual fog-of-war effects through shader keywords. By adjusting the fog-of-war functions, high-, mid- and low-end devices are all catered for so that each runs smoothly. In the method of the embodiment of the invention, the adjustable fog-of-war functions include: (1) the fog-of-war canvas texture size, e.g. 1024 × 1024 for high- and mid-end phones and 512 × 512 for low-end phones; (2) the fog-of-war block texture size, e.g. 256 × 256 textures are used as the block textures; (3) the height fog effect, switched on and off through a shader keyword: on for high-end devices, off for mid- and low-end devices; (4) the flowing-cloud effect over the fog, switched on and off through a shader keyword: on for high-end devices, off for mid- and low-end devices; (5) the format of the fog-of-war canvas texture, changed from RenderTextureFormat.BGRA32 to RenderTextureFormat.RGFloat.
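A minimal sketch of how such tier-dependent settings might be applied in Unity C#, assuming the device tier is inferred from SystemInfo.graphicsMemorySize and that the shader keyword names are hypothetical; the canvas sizes and texture format follow the list above:

    using UnityEngine;

    // Pick fog-of-war quality settings from a rough hardware tier and apply them.
    public static class FogQualitySetup
    {
        public enum Tier { Low, Mid, High }

        // Assumed heuristic: classify the device by graphics memory (MB).
        public static Tier DetectTier()
        {
            int vramMb = SystemInfo.graphicsMemorySize;
            if (vramMb >= 4096) return Tier.High;
            if (vramMb >= 2048) return Tier.Mid;
            return Tier.Low;
        }

        public static RenderTexture CreateCanvas(Tier tier, Material fogMaterial)
        {
            // Canvas size per the adjustable functions listed above.
            int size = (tier == Tier.Low) ? 512 : 1024;
            var canvas = new RenderTexture(size, size, 0, RenderTextureFormat.RGFloat);

            // Keyword names are hypothetical examples of per-effect shader switches.
            if (tier == Tier.High)
            {
                fogMaterial.EnableKeyword("FOG_HEIGHT");
                fogMaterial.EnableKeyword("FOG_FLOW");
            }
            else
            {
                fogMaterial.DisableKeyword("FOG_HEIGHT");
                fogMaterial.DisableKeyword("FOG_FLOW");
            }
            return canvas;
        }
    }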
Although specific embodiments have been described herein, those of ordinary skill in the art will recognize that many other modifications or alternative embodiments are equally within the scope of this disclosure. For example, any of the functions and/or processing capabilities described in connection with a particular device or component may be performed by any other device or component. In addition, while various illustrative implementations and architectures have been described in accordance with embodiments of the present disclosure, those of ordinary skill in the art will recognize that many other modifications of the illustrative implementations and architectures described herein are also within the scope of the present disclosure.
Certain aspects of the present disclosure are described above with reference to block diagrams and flowchart illustrations of systems, methods, apparatuses and/or computer program products according to example embodiments. It will be understood that one or more blocks of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by executing computer-executable program instructions. Also, according to some embodiments, some blocks of the block diagrams and flow diagrams may not necessarily be performed in the order shown, or may not necessarily be performed in their entirety. In addition, additional components and/or operations beyond those shown in the block diagrams and flow diagrams may be present in certain embodiments.
Accordingly, blocks of the block diagrams and flowchart illustrations support combinations of means for performing the specified functions, combinations of elements or steps for performing the specified functions and program instruction means for performing the specified functions. It will also be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, can be implemented by special purpose hardware-based computer systems that perform the specified functions, elements or steps, or combinations of special purpose hardware and computer instructions.
Program modules, applications, etc. described herein may include one or more software components, including, for example, software objects, methods, data structures, etc. Each such software component may include computer-executable instructions that, in response to execution, cause at least a portion of the functionality described herein (e.g., one or more operations of the illustrative methods described herein) to be performed.
The software components may be encoded in any of a variety of programming languages. An illustrative programming language may be a low-level programming language, such as assembly language associated with a particular hardware architecture and/or operating system platform. Software components that include assembly language instructions may need to be converted by an assembler program into executable machine code prior to execution by a hardware architecture and/or platform. Another exemplary programming language may be a higher level programming language, which may be portable across a variety of architectures. Software components that include higher level programming languages may need to be converted to an intermediate representation by an interpreter or compiler before execution. Other examples of programming languages include, but are not limited to, a macro language, a shell or command language, a job control language, a scripting language, a database query or search language, or a report writing language. In one or more exemplary embodiments, a software component containing instructions of one of the above programming language examples may be executed directly by an operating system or other software component without first being converted to another form.
The software components may be stored as files or other data storage constructs. Software components of similar types or related functionality may be stored together, such as in a particular directory, folder, or library. Software components may be static (e.g., preset or fixed) or dynamic (e.g., created or modified at execution time).
The embodiments of the present invention have been described in detail with reference to the accompanying drawings, but the present invention is not limited to the above embodiments, and various changes can be made within the knowledge of those skilled in the art without departing from the gist of the present invention.

Claims (10)

1. A GPU-based fog-of-war system, characterized by comprising:
a map acquisition module for obtaining a fog-of-war block identifier from map coordinates, and for obtaining, from a resource configuration table, the unlocking condition and the fog-of-war texture path corresponding to the block identifier;
a texture processing module for determining the fog-of-war unlocking state corresponding to the block identifier from the unlocking condition, reading the fog-of-war texture path according to that state, passing the texture to a shader, and processing the fog-of-war texture with the shader to obtain rendered texture data; and
a texture rendering module for rendering the rendered texture data to obtain the fog-of-war scene effect.
2. The GPU-based fog-of-war system according to claim 1, wherein the texture processing module comprises:
an unlock judgment module for judging, from the unlocking condition, whether the game character has met the condition and unlocked the block, to obtain the fog-of-war unlocking state;
a fog-of-war display module for reading the fog-of-war texture path, passing the texture to the shader, and processing the fog-of-war texture with the shader to obtain the rendered texture data;
a fog dissipation module for reducing the fog density of the fog of war from high to low over a preset time until it reaches zero; and
a scene display module for obtaining the rendered texture data from the scene map.
3. The GPU-based fog-of-war system according to claim 1, further comprising:
a hardware detection module for detecting the hardware configuration of the user's mobile device and determining the hardware tier of the device from that configuration; and
a display optimization module for reading the fog-of-war function configuration table according to the hardware tier, setting the fog-of-war canvas texture size and block texture size, and enabling or disabling individual fog-of-war effects through shader keywords.
4. The GPU-based fog-of-war system according to claim 1, further comprising: a fog-of-war design module for providing an interactive interface that displays the scene map at a given scale, allows the user to mark fog-of-war regions in the scene map and assign fog-of-war textures to them, and generates the fog-of-war information of the scene map from the user's operations, the fog-of-war information comprising the fog-of-war block identifiers, the fog-of-war regions and the fog-of-war texture paths.
5. The GPU-based fog-of-war system according to claim 1, further comprising: a resource configuration module for assigning to each fog-of-war block identifier the corresponding unlocking condition, fog-of-war texture path, and the NPCs and items spawned after unlocking, and storing them in the resource configuration table.
6. The GPU-based fog-of-war system according to claim 1, further comprising:
an edge blur module for applying a Gaussian blur to the edges of the fog-of-war texture to generate a first texture; and
an edge tuning module for providing an interactive interface in which the first texture is tuned to generate a second texture.
7. The GPU-based fog-of-war system according to claim 1, further comprising:
a height fog module for controlling the darkness of the fog color according to the distance from the ground; and
a flowing fog effect module for adding a UV scrolling effect on top of the rendered fog-of-war texture.
8. A GPU-based fog-of-war generation method, characterized by comprising the following steps:
S100, obtaining a fog-of-war block identifier from map coordinates, and obtaining, from a resource configuration table, the unlocking condition and fog-of-war texture path corresponding to the block identifier;
S200, determining the fog-of-war unlocking state corresponding to the block identifier from the unlocking condition, reading the fog-of-war texture path according to that state, passing the texture to a shader, and processing the fog-of-war texture with the shader to obtain rendered texture data; and
S300, rendering the rendered texture data on the GPU to obtain the fog-of-war scene effect.
9. The GPU-based fog-of-war generation method according to claim 8, wherein step S200 comprises:
S210, judging, from the unlocking condition, whether the game character has met the condition and unlocked the block, to obtain the fog-of-war unlocking state;
S220, if the fog-of-war unlocking state is not unlocked, reading the fog-of-war texture path, passing the texture to a shader, and processing the fog-of-war texture with the shader to obtain the rendered texture data; and
S230, if the fog-of-war unlocking state is first-time unlocking, having the shader reduce the fog density from high to low over time until it reaches zero, and displaying the scene map.
10. The GPU-based fog-of-war generation method according to claim 8, further comprising:
detecting the hardware configuration of the user's mobile device and determining the hardware tier of the device from that configuration; and
reading the fog-of-war function configuration table according to the hardware tier, setting the fog-of-war canvas texture size and block texture size, and enabling or disabling individual fog-of-war effects through shader keywords.
CN202011261399.0A (priority date 2020-11-12, filing date 2020-11-12) · Fog dispersal system based on GPU and fog dispersal generation method · Withdrawn · CN112354187A (en)

Priority Applications (1)

Application number: CN202011261399.0A (CN112354187A, en) · Priority date: 2020-11-12 · Filing date: 2020-11-12 · Title: Fog dispersal system based on GPU and fog dispersal generation method

Applications Claiming Priority (1)

Application number: CN202011261399.0A (CN112354187A, en) · Priority date: 2020-11-12 · Filing date: 2020-11-12 · Title: Fog dispersal system based on GPU and fog dispersal generation method

Publications (1)

Publication number: CN112354187A · Publication date: 2021-02-12

Family

ID=74515406

Family Applications (1)

Application number: CN202011261399.0A (CN112354187A, en) · Priority date: 2020-11-12 · Filing date: 2020-11-12 · Title: Fog dispersal system based on GPU and fog dispersal generation method

Country Status (1)

Country Link
CN (1) CN112354187A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112915536A (en) * 2021-04-02 2021-06-08 网易(杭州)网络有限公司 Rendering method and device of virtual model
CN113332720A (en) * 2021-05-27 2021-09-03 网易(杭州)网络有限公司 Game map display method and device, computer equipment and storage medium
CN113345068A (en) * 2021-06-10 2021-09-03 西安恒歌数码科技有限责任公司 War fog-lost drawing method and system based on osgEarth
CN113345068B (en) * 2021-06-10 2023-12-05 西安恒歌数码科技有限责任公司 Method and system for drawing war camouflage based on osgEarth

Similar Documents

Publication Publication Date Title
CN112354187A (en) Fog dispersal system based on GPU and fog dispersal generation method
KR101732288B1 (en) Sprite graphics rendering system
US8022950B2 (en) Stochastic culling of rays with increased depth of recursion
US8085267B2 (en) Stochastic addition of rays in a ray tracing image processing system
CN109260708A (en) Map rendering method, device and computer equipment
MXPA06012368A (en) Integration of three dimensional scene hierarchy into two dimensional compositing system.
CN101553771A (en) Rendering hypertext markup language content
US20080122846A1 (en) Adaptive Ray Data Reorder for Optimized Ray Temporal Locality
US20090267960A1 (en) Color Modification of Objects in a Virtual Universe
CN112717414B (en) Game scene editing method and device, electronic equipment and storage medium
CN111400024A (en) Resource calling method and device in rendering process and rendering engine
WO2015037169A1 (en) Rendering device
US20190114819A1 (en) Dimensional content surface rendering
JP2008305347A (en) Method and device for generating interference discrimination information
CN110853122A (en) Animation generation method, animation generation device and storage medium
CN116778038A (en) Animation editor and animation design method based on three-dimensional map visualization platform
CN115970275A (en) Projection processing method and device for virtual object, storage medium and electronic equipment
US11302052B2 (en) Forced contiguous data for execution of evaluation logic used in animation control
CN113192173B (en) Image processing method and device of three-dimensional scene and electronic equipment
CN115167940A (en) 3D file loading method and device
CN111340949B (en) Modeling method, computer device and storage medium for 3D virtual environment
CN110827400B (en) Method and device for generating model of object in three-dimensional scene and terminal
CN113064539A (en) Special effect control method and device, electronic equipment and storage medium
KR100536552B1 (en) Method for offering multiview and recordable media recording programs for enabling the method
CN111402348A (en) Method and device for forming illumination effect and rendering engine

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
WW01: Invention patent application withdrawn after publication (application publication date: 2021-02-12)