CN109925715B - Virtual water area generation method and device and terminal - Google Patents

Virtual water area generation method and device and terminal

Info

Publication number: CN109925715B
Application number: CN201910084328.9A
Authority: CN (China)
Prior art keywords: water area, virtual water area, generating, bitmap
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN109925715A
Inventor: 覃飏
Current assignee: Tencent Technology Shenzhen Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201910084328.9A
Publication of CN109925715A
Application granted; publication of CN109925715B

Abstract

The invention discloses a virtual water area generation method, device, and terminal, belonging to the technical field of computer graphics. The virtual water area generation method comprises the following steps: acquiring editing data of a current canvas frame; writing the editing data of the current canvas frame into a non-idle-state cache space, where the non-idle-state cache space stores editing data of historical canvas frames; generating a lattice map according to the editing data of all canvas frames in the non-idle-state cache space; generating a mesh model of the virtual water area according to the lattice map; and rendering the mesh model of the virtual water area to obtain the virtual water area. The method can construct the model of the virtual water area and generate the virtual water area directly in the Unity3D editor, omitting the step of separately modeling polygon patches in third-party modeling software and importing them into the game scene, which greatly reduces the workload in the game production process and improves the flexibility and efficiency of game production.

Description

Virtual water area generation method and device and terminal
Technical Field
The invention relates to the technical field of computer graphics, and in particular to a virtual water area generation method, device, and terminal.
Background
Unity3D is a comprehensive game development tool from Unity Technologies that lets developers easily create many types of interactive content for multiple platforms, such as three-dimensional video games, architectural visualizations, and real-time three-dimensional animations; it is a fully integrated professional game engine.
Most 3D game scenes contain a virtual water area, but Unity3D in the prior art has no function for creating three-dimensional models. Therefore, when generating a virtual water area in a 3D game scene, a three-dimensional model of the virtual water area is typically first created in third-party modeling software, such as 3ds Max or Maya, according to the features of the game scene, its architectural layout, and the scene concept art; the three-dimensional model of the virtual water area is then imported into the Unity3D editor and placed at the corresponding position in the game scene.
In the process of implementing the invention, the inventor finds that at least the following defects exist in the prior art:
in the related art, whenever the 3D game scene changes even slightly, the three-dimensional model of the virtual water area must be reconstructed in the third-party modeling software and re-imported into the Unity3D editor. However, scene changes are very likely at the initial stage of 3D game development, and for scenes with many virtual water areas the existing method of constructing them is not only complex and inflexible but also labor-intensive and inefficient.
Disclosure of Invention
In order to solve the problems in the prior art, embodiments of the present invention provide a virtual water area generation method, apparatus, and terminal. The technical scheme is as follows:
in one aspect, a virtual water area generation method is provided, and the method includes:
acquiring editing data of a current canvas frame;
writing the editing data of the current canvas frame into a non-idle-state cache space, where the non-idle-state cache space stores editing data of historical canvas frames;
generating a lattice map according to the editing data of all canvas frames in the non-idle-state cache space;
generating a mesh model of the virtual water area according to the lattice map;
and rendering the mesh model of the virtual water area to obtain the virtual water area.
In another aspect, there is provided a virtual water area generation apparatus, the apparatus including:
an acquisition module, configured to acquire the editing data of the current canvas frame;
a first writing module, configured to write the editing data of the current canvas frame into a non-idle-state cache space, where the non-idle-state cache space stores editing data of historical canvas frames;
a first generation module, configured to generate a lattice map according to the editing data of all canvas frames in the non-idle-state cache space;
a second generation module, configured to generate a mesh model of the virtual water area according to the lattice map;
and a rendering module, configured to render the mesh model of the virtual water area to obtain the virtual water area.
In another aspect, a terminal is provided, including:
a processor adapted to implement one or more instructions; and
a memory storing one or more instructions adapted to be loaded by the processor to execute the virtual water area generation method.
In another aspect, a computer storage medium is provided, which stores computer program instructions that, when executed, implement the virtual water area generation method described above.
The technical scheme provided by the embodiment of the invention has the following beneficial effects:
according to the embodiment of the invention, the editing data of the current canvas frame is obtained, the editing data of the current canvas frame is written into the non-idle-state cache space, the editing data of the historical canvas frame is stored in the non-idle-state cache space, the dot-matrix diagram is generated according to the editing data of all the canvas frames in the non-idle-state cache space, the grid model of the virtual water area is generated according to the dot-matrix diagram, the grid model of the virtual water area is rendered, and the virtual water area is obtained, so that the model construction of the virtual water area and the generation of the virtual water area can be directly carried out in the editor of Unity3D, the step of independently modeling a dough sheet in third-party modeling software and then importing the game scene of Unity3D is omitted, the work load of art in the game manufacturing process is greatly reduced, and the flexibility and the efficiency of game manufacturing are improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings based on them without creative effort.
Fig. 1 is a schematic flow chart of a virtual water area generation method according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of another virtual water area generation method according to an embodiment of the present invention;
fig. 3(a) is a schematic diagram of a bitmap provided by an embodiment of the present invention; fig. 3(b) is a schematic diagram of a lattice map corresponding to the bitmap of fig. 3(a) provided by the embodiment of the present invention; fig. 3(c) is a schematic diagram of a mesh model of a virtual water area corresponding to the lattice map of fig. 3(b) provided by the embodiment of the present invention;
fig. 4 is a schematic flowchart of a method for generating a lattice map according to an embodiment of the present invention;
fig. 5(a) is a schematic view of a scene of a virtual water area to be generated according to an embodiment of the present invention;
fig. 5(b) is a schematic diagram of a virtual water area generating process provided by the embodiment of the invention; fig. 5(c) is a schematic diagram of a generated virtual water area provided by the embodiment of the invention;
fig. 6 is a schematic structural diagram of a virtual water area generating apparatus according to an embodiment of the present invention;
FIG. 7 is a schematic view of another virtual water area generating apparatus according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a first generation module provided in the embodiment of the present invention;
fig. 9 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Referring to fig. 1, a flow chart of a virtual water area generating method according to an embodiment of the invention is shown. It should be noted that the virtual water area generating method according to the embodiment of the present invention is applicable to the virtual water area generating apparatus according to the embodiment of the present invention, and the virtual water area generating apparatus may be configured in a Unity3D editor in a terminal, where the terminal may be a hardware device having various operating systems, such as a mobile phone, a tablet computer, a palmtop computer, and a personal digital assistant.
Further, the present specification provides method steps as described in the embodiments or flowcharts, but an implementation may include more or fewer steps based on routine or non-inventive labor. The order of steps recited in the embodiments is merely one of many possible execution orders and does not represent the only order of execution. When an actual system or end product executes, the steps may be performed sequentially or in parallel (for example, in a parallel-processor or multi-threaded environment) according to the methods shown in the embodiments or figures. Specifically, as shown in fig. 1, the method includes:
s101, acquiring editing data of the current canvas frame.
In this specification embodiment, the Unity3D editor may provide an editing tool for editing a virtual water area in a 3D game scene, which may be, but is not limited to, a brush, an eyedropper, and the like.
When a user creates a virtual water area in a 3D game scene by using the Unity3D editor (hereinafter simply referred to as the editor), the editing tool may be activated by clicking with an input device such as a mouse and then moved within the canvas of the target scene to perform the virtual water area generating operation. As the editing tool moves within the canvas, the editor may retrieve the editing data of the current canvas frame.
It should be noted that, after the editing tool is activated, the user may set its parameter information, for example, parameters such as the size of the editing tool, its flow rate, and its resolution. The editing data of the current canvas frame may include the parameter information of the editing tool in the current canvas frame and the position information of the intersection of the editing tool and the canvas.
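As a concrete illustration of the per-frame editing data just described, it might be modeled as follows. This is a hypothetical Python sketch, not from the patent; all field names (`ToolParams`, `hit_pos`, etc.) are assumptions.

```python
# Hypothetical container for one canvas frame's editing data: the editing
# tool's preset parameters plus where the tool intersects the canvas.
from dataclasses import dataclass

@dataclass
class ToolParams:
    size: float        # size of the editing tool (e.g. brush radius)
    flow: float        # flow rate of the editing tool
    resolution: int    # resolution of the editing tool
    feather: float     # feather value controlling the soft edge

@dataclass
class FrameEditData:
    params: ToolParams
    hit_pos: tuple     # (x, y) coordinates of the tool/canvas intersection

frame = FrameEditData(ToolParams(8.0, 0.5, 256, 2.0), (120.0, 45.0))
assert frame.params.feather == 2.0
```

Each brushed frame would contribute one such record to the cache space described in the next step.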
S103, writing the editing data of the current canvas frame into a non-idle-state buffer space; the non-idle-state buffer space stores editing data of historical canvas frames.
In this embodiment, as shown in fig. 2, before acquiring the editing data of the current canvas frame, the editor may create at least two buffer spaces in response to a virtual water area generation request initiated by a user, where the state of each buffer space is either idle or non-idle. The idle state indicates that the buffer space is empty and stores no data; the non-idle state indicates that the buffer space is not empty and stores data. Specifically, the non-idle-state buffer space stores the editing data of historical canvas frames, that is, the editing data of all canvas frames before the current canvas frame.
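The idle/non-idle distinction above can be made concrete with a minimal Python sketch (not from the patent; the class name `CacheSpace` and its members are illustrative assumptions):

```python
class CacheSpace:
    """A buffer that stores the editing data of canvas frames."""

    def __init__(self):
        self.frames = []  # editing data of historical canvas frames

    @property
    def is_idle(self):
        # Idle state: the buffer is empty and stores no data.
        return len(self.frames) == 0

    def write(self, edit_data):
        self.frames.append(edit_data)

    def clear(self):
        self.frames.clear()

# On a virtual-water-area generation request, create at least two buffers.
rt1, rt2 = CacheSpace(), CacheSpace()
assert rt1.is_idle and rt2.is_idle  # both empty before editing starts

rt1.write({"brush_size": 8, "hit_pos": (120, 45)})
assert not rt1.is_idle  # rt1 now holds historical frame data
```

A buffer's state is thus determined purely by whether it currently stores data, matching the state definitions above.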
S105, generating a lattice map according to the editing data of all canvas frames in the non-idle-state cache space.
The lattice map represents the distribution of the water-surface vertex mesh. Specifically, as shown in fig. 2, a bitmap may first be generated according to the editing data of all canvas frames in the non-idle-state buffer space, and the lattice map may then be generated from that bitmap.
In a specific embodiment, the editing data of the canvas frames includes the parameter information of the editing tool corresponding to each canvas frame and the position information of the intersection of the editing tool and the canvas. The parameter information of the editing tool is preset for each canvas frame and may include, for example, the size of the editing tool, its flow rate, and its resolution. The intersection position information may be the position coordinates of the intersection in the canvas. Correspondingly, generating the bitmap according to the editing data of all canvas frames in the non-idle-state cache space includes: filling pixels according to the parameter information of the editing tool and the intersection position information to generate the bitmap.
Specifically, the parameter information and the intersection position information of the editing tool may be transmitted to a shader of a GPU (Graphics Processing Unit); the shader may fill the corresponding pixels according to this information to generate the bitmap, which may then be written into the non-idle-state cache space. Of course, the parameter information and the intersection position information may also be transmitted to a CPU (Central Processing Unit), which generates the bitmap instead.
In order to precisely control the boundary of the subsequently generated virtual water area so that it joins the surrounding scene naturally, in an embodiment of the present specification the parameter information of the editing tool may include a feather value. Correspondingly, when the bitmap is generated by filling pixels according to the parameter information of the editing tool and the intersection position information, the feathered region and the non-feathered region can be determined according to the feather value and the intersection position information, and pixel points with different color values are then filled into the feathered region and the non-feathered region respectively to obtain the bitmap. For example, the feathered region may be filled with gray pixels and the non-feathered region with white pixels, as shown in the schematic diagram of the bitmap in fig. 3(a).
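The feathered fill can be sketched as follows. This is an illustrative pure-Python stand-in for the shader-side pixel fill, assuming a circular brush: pixels within the brush radius are white (non-feathered), pixels in an extra band of width equal to the feather value are gray (feathered). All names and numeric conventions are assumptions.

```python
WHITE, GRAY, BLACK = 255, 128, 0

def fill_bitmap(width, height, center, radius, feather):
    """Fill a bitmap around the brush/canvas intersection point `center`."""
    cx, cy = center
    bitmap = [[BLACK] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            d = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
            if d <= radius:
                bitmap[y][x] = WHITE   # non-feathered (core) region
            elif d <= radius + feather:
                bitmap[y][x] = GRAY    # feathered (edge) region
    return bitmap

bmp = fill_bitmap(16, 16, center=(8, 8), radius=4, feather=2)
assert bmp[8][8] == WHITE    # brush center lies in the core
assert bmp[8][13] == GRAY    # just outside the radius, inside the feather band
```

A real implementation would run per-pixel in a GPU shader and could also blend by flow rate, but the region split by the feather value is the part the patent relies on.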
In this embodiment, generating the lattice map from the bitmap may adopt the method shown in fig. 4, which may include:
s401, determining a feather area and a non-feather area in the bitmap according to color values of pixel points in the bitmap.
For example, a region filled with gray pixels is determined as a feathered region, and a region filled with white pixels is determined as a non-feathered region.
S403, generating a first lattice area corresponding to the feathered area according to a first lattice density value, and generating a second lattice area corresponding to the non-feathered area according to a second lattice density value, wherein the first lattice density value is greater than the second lattice density value.
S405, generating a dot matrix map according to the first dot matrix area and the second dot matrix area.
In this way, in the generated lattice map, the feathered area at the edge has a higher density of points while the non-feathered area has a relatively lower density, as shown in the schematic diagram of the lattice map in fig. 3(b). This ensures that the vertices of the mesh model subsequently generated from the lattice map contain more of the required vertex data information, which helps achieve a natural join with the surroundings.
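The fig. 4 flow can be condensed into a short Python sketch: sample lattice points from the bitmap at a finer step in the feathered (gray) region than in the non-feathered (white) region. The step values and pixel conventions are assumptions for illustration.

```python
WHITE, GRAY = 255, 128

def make_lattice(bitmap, dense_step=1, sparse_step=4):
    # dense_step < sparse_step, i.e. the first (feathered) lattice density
    # is greater than the second (non-feathered) density.
    points = []
    for y, row in enumerate(bitmap):
        for x, v in enumerate(row):
            if v == GRAY and x % dense_step == 0 and y % dense_step == 0:
                points.append((x, y))   # first lattice area (feathered edge)
            elif v == WHITE and x % sparse_step == 0 and y % sparse_step == 0:
                points.append((x, y))   # second lattice area (interior)
    return points

# One gray edge row above four white interior rows.
bitmap = [[GRAY] * 8] + [[WHITE] * 8 for _ in range(4)]
pts = make_lattice(bitmap)
assert len(pts) == 10  # 8 dense feather points + 2 sparse interior points
```

The union of the two point sets is the lattice map fed to the meshing step.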
In order to process consecutive canvas frames while avoiding errors and improving processing efficiency, in an embodiment of the present specification the write and read operations on the non-idle-state buffer space are separated, as shown in fig. 2. After the bitmap is generated according to the editing data of all canvas frames in the non-idle-state buffer space, that editing data may be written into the idle-state buffer space and the non-idle-state buffer space changed to the idle state. That is, after the editing data of all canvas frames in the non-idle-state buffer space is copied to the idle-state buffer space, the non-idle-state buffer space may be emptied to convert it into an idle-state buffer space.
S107, generating a mesh model of the virtual water area according to the lattice map.
In the embodiments of the present disclosure, the lattice map may be processed by using a preset triangulation algorithm to generate a triangular mesh model of the virtual water area.
The triangulation is defined as follows: let V be a finite set of points in the two-dimensional real domain, let an edge e be a closed line segment whose endpoints are points of the set, and let E be a set of such edges. Then a triangulation T = (V, E) of the point set V is a plane graph G that satisfies the following conditions: (1) apart from their endpoints, edges in the plane graph contain no other points of the point set; (2) no two edges intersect; (3) all faces in the plane graph are triangular, and the union of all triangular faces is the convex hull of the scattered point set V.
Specifically, the preset triangulation algorithm may be an algorithm that forms a regular triangular mesh structure; fig. 3(c) shows a schematic diagram of a triangular mesh model of the virtual water area generated by such an algorithm. Of course, the preset triangulation algorithm may also be one that forms an irregular triangular mesh structure, such as a Delaunay triangulation algorithm. The vertices in a triangular mesh model obtained by the Delaunay triangulation algorithm are distributed more uniformly, which helps eliminate jagged edges at the boundary of the generated virtual water area.
Specifically, suppose an edge e in E has endpoints a and b. If there is a circle passing through a and b that contains no other point of the point set V, e is called a Delaunay edge. If a triangulation T of the point set V contains only Delaunay edges, it is called a Delaunay triangulation. Algorithms for computing the Delaunay triangulation may include, but are not limited to, the edge-flipping algorithm, the point-by-point insertion algorithm, the divide-and-conquer algorithm, the Bowyer-Watson algorithm, and the like.
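Of the two options above, the regular triangular mesh structure is the simpler to sketch: lay the lattice points out as a w × h grid and emit two triangles per grid cell. This is an illustrative Python sketch under that assumption (a production implementation might instead use a Delaunay algorithm such as Bowyer-Watson for more uniform vertices).

```python
def regular_triangulate(w, h):
    """Return triangles as index triples into a row-major (w x h) vertex grid."""
    tris = []
    for y in range(h - 1):
        for x in range(w - 1):
            i = y * w + x                           # top-left vertex of the cell
            tris.append((i, i + 1, i + w))          # upper-left triangle
            tris.append((i + 1, i + w + 1, i + w))  # lower-right triangle
    return tris

tris = regular_triangulate(3, 3)
assert len(tris) == 8  # (3-1) * (3-1) cells, 2 triangles each
```

The resulting index triples are exactly the triangle list a mesh model consumes alongside its vertex positions.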
S109, rendering the mesh model of the virtual water area to obtain the virtual water area.
Specifically, the vertex data of each vertex in the mesh model of the virtual water area may be obtained to form a vertex data set, which is passed as input to a vertex shader, where the data is processed. The graphics processor then assembles the output of the vertex shader into triangles, clips the triangles appropriately to fit the user viewport, and passes the clipped triangles to the rasterization unit to obtain fragment data. Fragment data is a simple data format; each fragment contains all the pixels of a triangle that can be displayed on the screen, and its data content is usually determined by the vertex shader. In fact, the vertex shader may output the vertex attribute parameters as its own output. The rasterization unit is responsible for interpolating the vertex data output by the vertex shader across each triangle, so that every pixel in the fragment data obtains a correct attribute value. Thereafter, the interpolated but otherwise unprocessed fragment data is transmitted to a fragment shader, which determines and outputs the final color of each pixel in the fragment data, thereby completing the rendering.
It should be noted that, in addition to the fragment data received by the fragment shader, some pre-stored texture data may also be passed to the fragment shader for sampling by the fragment shader.
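The color interpolation the rasterization unit performs can be illustrated with barycentric coordinates: each pixel inside a triangle gets an attribute blended from the three vertex values by its barycentric weights. This pure-Python sketch is for exposition only; GPUs do this in fixed-function hardware.

```python
def barycentric(p, a, b, c):
    """Barycentric weights of point p w.r.t. triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    den = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    w0 = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / den
    w1 = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / den
    return w0, w1, 1.0 - w0 - w1

def interpolate(p, verts, colors):
    """Blend per-vertex colors at pixel p, as the rasterizer does."""
    w = barycentric(p, *verts)
    return tuple(sum(w[i] * colors[i][k] for i in range(3)) for k in range(3))

verts = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
colors = [(255, 0, 0), (0, 255, 0), (0, 0, 255)]
# The centroid weights each vertex color equally.
r, g, b = interpolate((4 / 3, 4 / 3), verts, colors)
assert abs(r - 85.0) < 1e-6 and abs(g - 85.0) < 1e-6 and abs(b - 85.0) < 1e-6
```

The interpolated attributes are what the fragment shader then receives when deciding each pixel's final color.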
In order to illustrate the technical solution of the embodiment of the present invention more clearly, the virtual water area generation method of the present invention is described below with a brush as the editing tool and a Render Texture (hereinafter abbreviated RT) as the cache space. It should be understood that the following example does not limit the present invention.
In the pond scene shown in fig. 5(a), a virtual water area needs to be brushed into the pond. The user can enable the brush function on the water control panel in the editor and at the same time set the parameters of the brush, which may include its size and feather value. Meanwhile, the editor creates two RTs of equal size, RT1 and RT2, in response to the virtual water area generation request. In this example a new virtual water area is created in the pond, so both RT1 and RT2 are idle before starting; at this point either RT1 or RT2 may be designated as non-idle, for example RT1, after which the idle or non-idle state of each is determined by the actual data stored in RT1 and RT2. If instead an existing virtual water area is being modified, the editor acquires the editing data of the existing virtual water area and writes it into RT1 or RT2, and the RT into which the editing data is written is in the non-idle state.
After that, the mouse cursor on the screen becomes a blue cursor, as shown in fig. 5(b); the blue range represents the range of the brush, and of course its size can be adjusted at any time. By holding down the left mouse button without releasing it and dragging, a virtual water area can be brushed out over the area where water is desired, as shown in fig. 5(c). The brushing-out process of the virtual water area is described below taking the first three frames as an example.
When the first frame is brushed, the editor acquires the editing data of the first canvas frame (the parameter information of the brush and the position information of the intersection of the brush and the canvas), writes it into RT1, and transmits the editing data in RT1 to the GPU shader for processing to obtain a corresponding first bitmap, which is written into RT1. The editor generates a first lattice map according to the first bitmap in RT1, generates a first virtual water area mesh model according to the first lattice map, and renders the first virtual water area mesh model to obtain the current virtual water area. Further, after generating the first lattice map from the first bitmap in RT1, the editor writes the editing data in RT1 into RT2 and clears RT1.
When the second frame is brushed, the editor acquires the editing data of the second canvas frame and writes it into RT2; all the editing data in RT2 (covering the first and second canvas frames) is transmitted to the GPU shader for processing to obtain a corresponding second bitmap, which is written into RT2. The editor generates a second lattice map according to the second bitmap in RT2, generates a second virtual water area mesh model according to the second lattice map, and renders it to obtain the current virtual water area. Further, after generating the second lattice map from the second bitmap in RT2, the editor writes the editing data in RT2 into RT1 and clears RT2.
When the third frame is brushed, the editor acquires the editing data of the third canvas frame and writes it into RT1; all the editing data in RT1 (covering the first, second, and third canvas frames) is transmitted to the GPU shader for processing to obtain a corresponding third bitmap, which is written into RT1. The editor generates a third lattice map according to the third bitmap in RT1, generates a third virtual water area mesh model according to the third lattice map, and renders it to obtain the current virtual water area. Further, after generating the third lattice map from the third bitmap in RT1, the editor writes the editing data in RT1 into RT2 and clears RT1. Subsequent frames proceed by analogy and are not described again here.
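The three-frame RT1/RT2 ping-pong above can be condensed into a short Python sketch. Buffers are modeled as plain lists, helper names are illustrative, and the bitmap/lattice/mesh generation is elided to a snapshot of the accumulated editing data.

```python
def process_frame(edit_data, active, spare):
    """active: the non-idle buffer; spare: the idle buffer.

    Writes the current frame's editing data, takes the snapshot from which
    the bitmap/lattice/mesh would be built, then copies everything into the
    idle buffer and clears the formerly non-idle one (the roles swap).
    """
    active.append(edit_data)     # write current frame's edit data
    frames = list(active)        # all frames -> bitmap -> lattice -> mesh
    spare.extend(active)         # copy edit data into the idle buffer
    active.clear()               # former non-idle buffer becomes idle
    return frames, spare, active

rt1, rt2 = [], []
snapshot1, rt_active, rt_idle = process_frame("frame1", rt1, rt2)
snapshot2, rt_active, rt_idle = process_frame("frame2", rt_active, rt_idle)
snapshot3, rt_active, rt_idle = process_frame("frame3", rt_active, rt_idle)
assert snapshot3 == ["frame1", "frame2", "frame3"]
```

Each frame thus reads the full editing history from one RT while the other stands by to receive the copy, which is the read/write separation described earlier.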
To sum up, the embodiment of the present invention obtains the editing data of the current canvas frame and writes it into the non-idle-state cache space, which stores the editing data of historical canvas frames; generates a lattice map according to the editing data of all canvas frames in the non-idle-state cache space; generates a mesh model of the virtual water area according to the lattice map; and renders the mesh model to obtain the virtual water area. The model construction and generation of the virtual water area can thus be carried out directly in the Unity3D editor, omitting the step of separately modeling polygon patches in third-party modeling software and then importing them into the game scene, which greatly reduces the art workload in the game production process and improves the flexibility and efficiency of game production.
In addition, because the embodiment of the invention directly creates the virtual water area in the Unity3D, the effect of the virtual water area is what you see is what you get, which facilitates the modification of the virtual water area by the creator and improves the creation efficiency. And the structure of the editor of Unity3D does not need to be changed complicatedly, thus being easy to popularize and implement.
Corresponding to the virtual water area generation methods provided in the above embodiments, the embodiments of the present invention also provide a virtual water area generation apparatus. Since the apparatus corresponds to the methods provided above, the embodiments of the method described earlier also apply to the apparatus provided in this embodiment and will not be described again in detail.
Referring to fig. 6, a schematic structural diagram of a virtual water area generating apparatus according to an embodiment of the present invention is shown, and as shown in fig. 6, the apparatus may include: an acquisition module 610, a first writing module 620, a first generation module 630, a second generation module 640, and a rendering module 650, wherein,
an obtaining module 610, configured to obtain edit data of a current canvas frame;
a first writing module 620, configured to write the editing data of the current canvas frame into a non-idle buffer space; the non-idle state cache space stores editing data of historical canvas frames;
a first generating module 630, configured to generate a lattice map according to the editing data of all canvas frames in the non-idle-state cache space;
a second generating module 640, configured to generate a mesh model of the virtual water area according to the lattice map;
and a rendering module 650, configured to render the mesh model of the virtual water area to obtain the virtual water area.
Referring to fig. 7, a schematic structural diagram of another virtual water area generating apparatus according to an embodiment of the present invention is shown, and as shown in fig. 7, the apparatus may include: a creation module 660, an acquisition module 610, a first writing module 620, a first generation module 630, a second generation module 640, a rendering module 650, a second writing module 670, and a change module 680. The obtaining module 610, the first writing module 620, the first generating module 630, the second generating module 640, and the rendering module 650 may refer to the foregoing embodiment of the apparatus shown in fig. 6, and are not described herein again, wherein,
a creating module 660, configured to create at least two cache spaces in response to the virtual water area generation request, where states of the cache spaces include an idle state and a non-idle state.
And a second writing module 670, configured to write the edit data of all canvas frames in the non-idle-state buffer space into an idle-state buffer space.
A changing module 680, configured to change the non-idle buffer space to an idle buffer space.
Optionally, as shown in fig. 8, the first generating module 630 may include:
a first generating submodule 6310, configured to generate a bitmap according to the editing data of all canvas frames in the non-idle-state cache space;
and a second generating submodule 6320, configured to generate a lattice map from the bitmap.
In a specific embodiment, the editing data of the canvas frame includes parameter information of an editing tool and intersection point position information of the editing tool and the canvas; correspondingly, the first generating sub-module 6310 is specifically configured to perform pixel filling according to the parameter information of the editing tool and the intersection position information, and generate the bitmap.
Optionally, the parameter information of the editing tool includes a feathering value, and accordingly, as shown in fig. 8, the first generating sub-module 6310 may include:
a first determining module 6311, configured to determine a feathering region and a non-feathering region according to the feathering value and the intersection position information;
a pixel filling module 6312, configured to fill pixel points with different color values into the feathering region and the non-feathering region, respectively, to obtain the bitmap.
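The pixel-filling step performed by modules 6311 and 6312 can be sketched as follows, assuming a circular editing tool: pixels within the tool radius of the intersection point form the non-feathering region, and the surrounding ring whose width equals the feathering value forms the feathering region. The function name, radius parameter, and color values are illustrative assumptions, not taken from the patent:

```python
import math

def fill_bitmap(width, height, center, radius, feather,
                inner_color=255, feather_color=128, background=0):
    """Fill a bitmap around the tool/canvas intersection point (illustrative).

    Pixels within `radius` of `center` form the non-feathering region; the
    ring between `radius` and `radius + feather` is the feathering region.
    The two regions receive different color values, as in claim 1.
    """
    bitmap = [[background] * width for _ in range(height)]
    cx, cy = center
    for y in range(height):
        for x in range(width):
            d = math.hypot(x - cx, y - cy)
            if d <= radius:
                bitmap[y][x] = inner_color    # non-feathering region
            elif d <= radius + feather:
                bitmap[y][x] = feather_color  # feathering region
    return bitmap
```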
In another specific embodiment, as shown in fig. 8, the second generation submodule 6320 may include:
the second determining module 6321 is configured to determine a feathering region and a non-feathering region in the bitmap according to the color values of the pixel points in the bitmap;
a lattice region generating module 6322, configured to generate a first lattice region corresponding to the feathering region according to a first lattice density value, and to generate a second lattice region corresponding to the non-feathering region according to a second lattice density value, wherein the first lattice density value is greater than the second lattice density value.
A lattice map generating module 6323, configured to generate the lattice map according to the first lattice region and the second lattice region.
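The sampling performed by modules 6321-6323 can be sketched as follows: lattice points are taken from the bitmap at a smaller sampling step (i.e., higher density) inside the feathering region than inside the non-feathering region, so the water's edge gets more vertices. The function name, color codes, and step sizes are illustrative assumptions:

```python
def build_lattice(bitmap, feather_color=128, inner_color=255,
                  feather_step=1, inner_step=2):
    """Sample lattice points from the bitmap (illustrative sketch).

    The feathering region is sampled at `feather_step` (dense, first lattice
    region) and the non-feathering region at `inner_step` (sparse, second
    lattice region); feather_step < inner_step mirrors the claim that the
    first lattice density value is greater than the second.
    """
    points = []
    for y, row in enumerate(bitmap):
        for x, color in enumerate(row):
            if color == feather_color and x % feather_step == 0 and y % feather_step == 0:
                points.append((x, y))  # first lattice region: dense
            elif color == inner_color and x % inner_step == 0 and y % inner_step == 0:
                points.append((x, y))  # second lattice region: sparse
    return points
```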
In another embodiment, the second generating module 640 may process the lattice map by using a preset triangulation algorithm to generate a triangular mesh model of the virtual water area.
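The patent does not name the preset triangulation algorithm; Delaunay triangulation would be a typical production choice for scattered lattice points. For the simpler case of a regular lattice, a sketch that emits two triangles per grid cell (row-major vertex indexing; the function name is illustrative):

```python
def triangulate_grid(cols, rows):
    """Triangulate a regular cols-by-rows lattice into a triangle mesh.

    A stand-in for the patent's unspecified 'preset triangulation algorithm'.
    Vertices are indexed row-major; each grid cell yields two triangles,
    which is how a water-surface vertex grid is commonly turned into a mesh.
    """
    triangles = []
    for y in range(rows - 1):
        for x in range(cols - 1):
            i = y * cols + x  # top-left vertex of the current cell
            triangles.append((i, i + 1, i + cols))             # upper triangle
            triangles.append((i + 1, i + cols + 1, i + cols))  # lower triangle
    return triangles
```

The resulting index triples could then be assigned to a Unity `Mesh` (its `triangles` array is a flat list of such vertex indices) before rendering.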
The virtual water area generating apparatus of the embodiment of the present invention can construct the model of the virtual water area and generate the virtual water area directly in the Unity3D editor, omitting the step of modeling a separate patch in third-party modeling software and importing it into the game scene. This greatly reduces the art workload in the game production process and improves the flexibility and efficiency of game making.
In addition, because the embodiment of the present invention creates the virtual water area directly in Unity3D, the effect of the virtual water area is what-you-see-is-what-you-get, which makes it easy for the creator to modify the virtual water area and improves authoring efficiency.
It should be noted that the division into functional modules in the apparatus of the foregoing embodiment is merely illustrative; in practical applications, the functions may be distributed among different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above.
Please refer to fig. 9, which is a schematic structural diagram of a terminal according to an embodiment of the present invention; the terminal is used for implementing the virtual water area generating method provided in the foregoing embodiments. Specifically:
the terminal 900 may include RF (Radio Frequency) circuitry 910, a memory 920 including one or more computer-readable storage media, an input unit 930, a display unit 940, a video sensor 950, audio circuitry 960, a WiFi (Wireless Fidelity) module 970, a processor 980 including one or more processing cores, and a power supply 990. Those skilled in the art will appreciate that the terminal structure shown in fig. 9 does not constitute a limitation of the terminal; the terminal may include more or fewer components than those shown, combine some components, or arrange the components differently. Wherein:
the RF circuit 910 may be used for receiving and transmitting signals during a message transmission or call, and in particular, for receiving downlink information from a base station and then processing the received downlink information by the one or more processors 980; in addition, data relating to uplink is transmitted to the base station. In general, RF circuit 910 includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like. In addition, the RF circuit 910 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), email, SMS (Short Messaging Service), and the like.
The memory 920 may be used to store software programs and modules, and the processor 980 performs various functional applications and data processing by running the software programs and modules stored in the memory 920. The memory 920 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data (such as video data or a phone book) created according to the use of the terminal 900. Further, the memory 920 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. Accordingly, the memory 920 may also include a memory controller to provide the processor 980 and the input unit 930 with access to the memory 920.
The input unit 930 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. Specifically, the input unit 930 may include an image input device 931 and other input devices 932. The image input device 931 may be a camera or a photoelectric scanning device. The input unit 930 may include other input devices 932 in addition to the image input device 931. In particular, other input devices 932 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 940 may be used to display information input by or provided to the user and various graphical user interfaces of the terminal 900, which may be made up of graphics, text, icons, video, and any combination thereof. The Display unit 940 may include a Display panel 941, and optionally, the Display panel 941 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like.
Terminal 900 can include at least one video sensor 950 for obtaining video information of a user. Terminal 900 can also include other sensors (not shown), such as a light sensor and a motion sensor. Specifically, the light sensor may include an ambient light sensor, which may adjust the brightness of the display panel 941 according to the brightness of ambient light, and a proximity sensor, which may turn off the display panel 941 and/or the backlight when the terminal 900 is moved to the ear. As one kind of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally three axes) and, when the terminal is stationary, the magnitude and direction of gravity; it can be used for applications that recognize the terminal's posture (such as landscape/portrait switching, related games, and magnetometer posture calibration) and for vibration-recognition functions (such as a pedometer or tap detection). Other sensors that can be configured in the terminal 900, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, are not described in detail here.
Audio circuitry 960, a speaker 961, and a microphone 962 may provide an audio interface between the user and the terminal 900. The audio circuit 960 may convert received audio data into an electrical signal and transmit it to the speaker 961, which converts it into a sound signal for output; conversely, the microphone 962 converts a collected sound signal into an electrical signal, which the audio circuit 960 receives and converts into audio data. The audio data is output to the processor 980 for processing and then transmitted via the RF circuit 910 to, for example, another terminal, or output to the memory 920 for further processing. The audio circuit 960 may also include an earphone jack to provide communication between peripheral headphones and the terminal 900.
WiFi is a short-range wireless transmission technology. Through the WiFi module 970, the terminal 900 can help the user send and receive e-mail, browse web pages, access streaming media, and the like, providing the user with wireless broadband Internet access. Although fig. 9 shows the WiFi module 970, it is not an essential part of the terminal 900 and may be omitted as needed without changing the essence of the invention.
The processor 980 is a control center of the terminal 900, connects various parts of the entire mobile phone using various interfaces and lines, and performs various functions of the terminal 900 and processes data by operating or executing software programs and/or modules stored in the memory 920 and calling data stored in the memory 920, thereby integrally monitoring the mobile phone. Optionally, processor 980 may include one or more processing cores; preferably, the processor 980 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 980.
The terminal 900 also includes a power supply 990 (e.g., a battery) for powering the various components. Preferably, the power supply may be logically connected to the processor 980 via a power management system that manages charging, discharging, and power consumption. The power supply 990 may also include one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other such components.
Although not shown, the terminal 900 may further include a bluetooth module or the like, which is not described in detail herein.
In this embodiment, the terminal 900 further includes a memory and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for performing the virtual water area generation method provided by the above method embodiments.
An embodiment of the present invention further provides a storage medium, which may be disposed in a terminal to store at least one instruction, at least one program, a code set, or a set of instructions related to implementing a virtual water area generating method in the method embodiment, where the at least one instruction, the at least one program, the code set, or the set of instructions may be loaded and executed by a processor of the terminal to implement the virtual water area generating method provided by the above method embodiment.
Optionally, in this embodiment, the storage medium may include, but is not limited to: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
It is noted that, herein, relational terms such as first and second are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (8)

1. A virtual water area generation method, comprising:
acquiring edit data of a current canvas frame; the edit data comprises a feathering value and position information of an intersection point between an editing tool and a canvas;
writing the edit data of the current canvas frame into a non-idle-state cache space; the non-idle-state cache space stores edit data of historical canvas frames;
determining a feathering region and a non-feathering region according to the edit data of all canvas frames in the non-idle-state cache space;
filling pixel points with different color values into the feathering region and the non-feathering region, respectively, to obtain a bitmap;
generating a lattice map according to the bitmap; the lattice map represents the distribution of the water surface vertex meshes of the virtual water area;
generating a mesh model of the virtual water area according to the lattice map; and
rendering the mesh model of the virtual water area to obtain the virtual water area.
2. The virtual water area generation method of claim 1, wherein before acquiring the edit data of the current canvas frame, the method further comprises:
creating at least two cache spaces in response to a virtual water area generation request, wherein the states of the cache spaces include an idle state and a non-idle state.
3. The virtual water area generation method according to claim 1, wherein after the bitmap is generated from the edit data of all canvas frames in the non-idle-state cache space, the method further comprises:
writing the edit data of all canvas frames in the non-idle-state cache space into an idle-state cache space; and
changing the non-idle-state cache space into an idle-state cache space.
4. The virtual water area generation method according to claim 1, wherein the generating a lattice map according to the bitmap comprises:
determining a feathering region and a non-feathering region in the bitmap according to the color values of the pixel points in the bitmap;
generating a first lattice region corresponding to the feathering region according to a first lattice density value;
generating a second lattice region corresponding to the non-feathering region according to a second lattice density value; and
generating the lattice map according to the first lattice region and the second lattice region;
wherein the first lattice density value is greater than the second lattice density value.
5. The virtual water area generation method according to claim 1, wherein the generating a mesh model of the virtual water area according to the lattice map comprises:
processing the lattice map with a preset triangulation algorithm to generate a triangular mesh model of the virtual water area.
6. A virtual water area generation apparatus, characterized in that the apparatus comprises:
an acquisition module, configured to acquire edit data of a current canvas frame; the edit data comprises a feathering value and position information of an intersection point between an editing tool and a canvas;
a first writing module, configured to write the edit data of the current canvas frame into a non-idle-state cache space; the non-idle-state cache space stores edit data of historical canvas frames;
a first generation module, configured to determine a feathering region and a non-feathering region according to the edit data of all canvas frames in the non-idle-state cache space; fill pixel points with different color values into the feathering region and the non-feathering region, respectively, to obtain a bitmap; and generate a lattice map according to the bitmap, the lattice map representing the distribution of the water surface vertex meshes of the virtual water area;
a second generation module, configured to generate a mesh model of the virtual water area according to the lattice map; and
a rendering module, configured to render the mesh model of the virtual water area to obtain the virtual water area.
7. A terminal, comprising:
a processor adapted to implement one or more instructions; and
a memory storing one or more instructions adapted to be loaded by the processor and to perform the virtual water area generation method of any of claims 1-5.
8. A computer storage medium storing computer program instructions which, when executed, implement the virtual water area generation method of any one of claims 1 to 5.
CN201910084328.9A 2019-01-29 2019-01-29 Virtual water area generation method and device and terminal Active CN109925715B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910084328.9A CN109925715B (en) 2019-01-29 2019-01-29 Virtual water area generation method and device and terminal

Publications (2)

Publication Number Publication Date
CN109925715A CN109925715A (en) 2019-06-25
CN109925715B true CN109925715B (en) 2021-11-16

Family

ID=66985272

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910084328.9A Active CN109925715B (en) 2019-01-29 2019-01-29 Virtual water area generation method and device and terminal

Country Status (1)

Country Link
CN (1) CN109925715B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110827407B (en) * 2019-10-29 2023-08-29 广州西山居网络科技有限公司 Method and system for automatically outputting meshing resource triangle fit degree
CN113426131B (en) * 2021-07-02 2023-06-30 腾讯科技(成都)有限公司 Picture generation method and device of virtual scene, computer equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101620740A (en) * 2008-06-30 2010-01-06 北京壁虎科技有限公司 Interactive information generation method and interactive information generation system
CN102044089A (en) * 2010-09-20 2011-05-04 董福田 Method for carrying out self-adaption simplification, gradual transmission and rapid charting on three-dimensional model
CN102682472A (en) * 2012-05-07 2012-09-19 电子科技大学 Particle effect visual synthesis system and method
KR20130035485A (en) * 2011-09-30 2013-04-09 (주)시지웨이브 System for publishing 3d virtual reality moving picture and method for publishing the same
CN106952329A (en) * 2017-02-21 2017-07-14 长沙趣动文化科技有限公司 Particle effect edit methods and system based on Unity3D and NGUI
CN106971414A (en) * 2017-03-10 2017-07-21 江西省杜达菲科技有限责任公司 A kind of three-dimensional animation generation method based on deep-cycle neural network algorithm

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130063460A1 (en) * 2011-09-08 2013-03-14 Microsoft Corporation Visual shader designer
KR101911906B1 (en) * 2012-09-26 2018-10-25 에스케이플래닛 주식회사 Apparatus for 3D object creation and thereof Method
CN105279253B (en) * 2015-10-13 2018-12-14 上海联彤网络通讯技术有限公司 Promote the system and method for webpage painting canvas rendering speed
CN108989869B (en) * 2017-05-31 2020-12-11 腾讯科技(深圳)有限公司 Video picture playing method, device, equipment and computer readable storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
小型 2D 游戏引擎设计和实现;谢宾;《中国优秀硕士学位论文全文数据库 信息科技辑》;20140515;正文第4.5.1.2节 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant