CN113298927A - Data processing method, application interface display device and auxiliary operation method and device - Google Patents


Info

Publication number
CN113298927A
Authority
CN
China
Prior art keywords: dimensional scene, texture, data, scene picture, dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011508274.3A
Other languages
Chinese (zh)
Inventor
郭剑雄
王杰
洪智标
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba China Co Ltd
Original Assignee
Alibaba China Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba China Co Ltd filed Critical Alibaba China Co Ltd
Publication of CN113298927A publication Critical patent/CN113298927A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/005: General purpose rendering architectures
    • G06T 15/04: Texture mapping
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

A method and a device for data processing, application interface display, and auxiliary operation are disclosed. Texture data and terrain data are obtained based on a two-dimensional scene picture, where the texture data characterizes the surface textures in the two-dimensional scene picture and the terrain data characterizes its terrain; a three-dimensional scene picture is then generated based on the texture data and the terrain data. Because the three-dimensional scene picture is generated from the representation logic of the two-dimensional scene picture, a three-dimensional scene picture with a better artistic expression effect can be obtained without designers redesigning art resources, which improves the user's sensory experience of the application while reducing labor consumption and saving labor cost.

Description

Data processing method, application interface display device and auxiliary operation method and device
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method and an apparatus for data processing, application interface display, and auxiliary operation.
Background
Versions released early in an application's life often have defects and shortcomings in their picture presentation. Overcoming them requires developers to redesign new art resources after the application is released, which incurs high labor cost.
For example, the version released when an application first comes online may provide only two-dimensional scene pictures (i.e., a 2D version application). To further improve the user experience and enhance artistic expression, the application interface then needs to be beautified after launch, for example by redesigning the application to provide three-dimensional scene pictures (i.e., a 3D version application). Redesigning the 3D pictures from scratch, however, requires a great deal of labor from designers and a long design cycle, which is unfriendly to both program developers and users.
Therefore, an effective solution to the above problems is needed.
Disclosure of Invention
The object of the present disclosure is to provide an effective solution to the above problems.
According to a first aspect of the present disclosure, there is provided a data processing method including: obtaining texture data and topographic data based on the two-dimensional scene picture, wherein the texture data is used for representing the earth surface texture in the two-dimensional scene picture, and the topographic data is used for representing the topography in the two-dimensional scene picture; and generating a three-dimensional scene picture based on the texture data and the terrain data.
According to a second aspect of the present disclosure, there is provided an application interface display method, including: displaying an application interface, wherein the application interface comprises a two-dimensional scene picture; and responding to a switching request of a user, and displaying a three-dimensional scene picture generated based on the two-dimensional scene picture in the application interface.
According to a third aspect of the present disclosure, there is provided an application interface display method, including: displaying an application interface, wherein the application interface comprises a two-dimensional scene picture; responding to a switching request of a user, and judging whether a three-dimensional scene picture generated based on the two-dimensional scene picture exists or not; and if the three-dimensional scene picture exists, displaying the three-dimensional scene picture, and if the three-dimensional scene picture does not exist, generating the three-dimensional scene picture based on the two-dimensional scene picture and displaying the generated three-dimensional scene picture.
According to a fourth aspect of the present disclosure, there is provided an auxiliary operation method comprising: receiving a two-dimensional scene image input by a user; generating a three-dimensional scene image based on the two-dimensional scene image in response to a three-dimensional conversion request of a user for the two-dimensional scene image; and providing the three-dimensional scene image to a user.
According to a fifth aspect of the present disclosure, there is provided an application interface display method, including: displaying an application interface, wherein the application interface comprises a two-dimensional scene picture; responding to a switching request of a user, and generating a three-dimensional scene picture based on an adjusting parameter set by the user and the two-dimensional scene picture; and displaying the three-dimensional scene picture in the application interface.
According to a sixth aspect of the present disclosure, there is provided an application interface display method, including: displaying an application interface, wherein the application interface comprises a two-dimensional scene picture; and responding to the condition that the running environment of the application program accords with the preset condition, and generating a three-dimensional scene picture based on the two-dimensional scene picture.
According to a seventh aspect of the present disclosure, there is provided a game interface display method including: in response to a user's switching operation on a two-dimensional parcel in a game interface, generating a three-dimensional parcel based on map resources of the two-dimensional parcel; and displaying a game interface including the three-dimensional parcel.
According to an eighth aspect of the present disclosure, there is provided a data processing system comprising a central processing unit and a graphics processing unit, wherein the central processing unit is configured to generate tasks for image processing involving pixel-level intensive operations and send the tasks to the graphics processing unit, and the graphics processing unit is configured to execute the tasks and return the task execution results to the central processing unit.
According to a ninth aspect of the present disclosure, there is provided a data processing apparatus comprising: the acquisition module is used for acquiring texture data and topographic data based on the two-dimensional scene picture, wherein the texture data is used for representing the surface texture in the two-dimensional scene picture, and the topographic data is used for representing the topography in the two-dimensional scene picture; and the generating module is used for generating a three-dimensional scene picture based on the texture data and the terrain data.
According to a tenth aspect of the present disclosure, there is provided an application interface display apparatus including: the display module is used for displaying an application interface, the application interface comprises a two-dimensional scene picture, and the display module is also used for responding to a switching request of a user and displaying a three-dimensional scene picture generated based on the two-dimensional scene picture in the application interface.
According to an eleventh aspect of the present disclosure, there is provided an application interface display apparatus including: a display module for displaying an application interface that includes a two-dimensional scene picture; a judging module for judging, in response to a user's switching request, whether a three-dimensional scene picture generated based on the two-dimensional scene picture exists; and a generation module for generating the three-dimensional scene picture based on the two-dimensional scene picture if it does not exist. The display module is further configured to display either the existing three-dimensional scene picture or the three-dimensional scene picture generated by the generation module.
According to a twelfth aspect of the present disclosure, there is provided an auxiliary operating device including: the receiving module is used for receiving a two-dimensional scene image input by a user; a generating module, configured to generate a three-dimensional scene image based on the two-dimensional scene image in response to a three-dimensional conversion request for the two-dimensional scene image by a user; and a providing module for providing the three-dimensional scene image to a user.
According to a thirteenth aspect of the present disclosure, there is provided an application interface display apparatus including: the display module is used for displaying an application interface, and the application interface comprises a two-dimensional scene picture; the generating module is used for responding to a switching request of a user and generating a three-dimensional scene picture based on an adjusting parameter set by the user and the two-dimensional scene picture; and the display module is also used for displaying the three-dimensional scene picture in the application interface.
According to a fourteenth aspect of the present disclosure, there is provided an application interface display apparatus including: the display module is used for displaying an application interface, and the application interface comprises a two-dimensional scene picture; and the generating module is used for responding to the condition that the running environment of the application program accords with the preset condition and generating a three-dimensional scene picture based on the two-dimensional scene picture.
According to a fifteenth aspect of the present disclosure, there is provided a game interface display device including: a generating module for generating, in response to a user's switching operation on a two-dimensional parcel in a game interface, a three-dimensional parcel based on map resources of the two-dimensional parcel; and a display module for displaying the game interface including the three-dimensional parcel.
According to a sixteenth aspect of the present disclosure, there is provided a computing device comprising: a processor; and a memory having executable code stored thereon, which when executed by the processor, causes the processor to perform the method of any of the first to seventh aspects described above.
According to a seventeenth aspect of the present disclosure, there is provided a non-transitory machine-readable storage medium having stored thereon executable code which, when executed by a processor of an electronic device, causes the processor to perform the method of any of the first to seventh aspects described above.
In this way, the three-dimensional scene picture is generated based on the texture data and the terrain data obtained from the two-dimensional scene picture, so that a three-dimensional scene picture with a better artistic expression effect can be obtained without designers redesigning art resources, which improves the user's sensory experience of the application while reducing labor consumption and saving labor cost.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing in greater detail exemplary embodiments thereof with reference to the attached drawings, in which like reference numerals generally represent like parts throughout.
Fig. 1 shows a schematic flow diagram of a data processing method according to an embodiment of the present disclosure.
Fig. 2 shows a schematic diagram of a data processing architecture for performing the data processing method of the present disclosure.
FIG. 3 shows a schematic flow diagram for generating a three-dimensional parcel based on a two-dimensional parcel, according to one embodiment of the present disclosure.
Fig. 4 shows a schematic structural diagram of a data processing apparatus according to an embodiment of the present disclosure.
Fig. 5 shows a schematic structural diagram of an application interface display device according to an embodiment of the present disclosure.
Fig. 6 shows a schematic structural diagram of a game interface display device according to an embodiment of the present disclosure.
FIG. 7 shows a schematic structural diagram of a computing device, according to one embodiment of the present disclosure.
Detailed Description
Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
In order to remedy the defects and shortcomings of an application's picture presentation while reducing the data processing workload, the present disclosure proposes that a new scene picture with a better presentation effect can be generated from the original scene picture provided while the application program runs.
The picture content of the new scene picture may be identical to that of the original scene picture, while its presentation effect is superior.
In other words, the distribution of objects in the new scene picture can be consistent with that in the original scene picture, while the objects appear more detailed and realistic.
The original scene picture may refer to a two-dimensional scene picture, and the new scene picture may refer to a three-dimensional scene picture generated based on the two-dimensional scene picture.
Therefore, the picture resources of the original scene picture (such as a two-dimensional scene picture) can be multiplexed, and a new scene picture (such as a three-dimensional scene picture) with better expression effect can be generated without redesigning the picture resources by designers.
That is, the present disclosure provides a data processing scheme that can reuse the picture resources of the original scene picture to generate a new scene picture with better presentation effect.
The principle of the data processing scheme of the present disclosure is explained below by way of example, taking the original scene picture to be a two-dimensional scene picture and the new scene picture to be a three-dimensional scene picture. It should be understood that the original and new scene pictures may instead both be three-dimensional, or both two-dimensional. That is, the present disclosure may also generate, based on a two-dimensional scene picture, another two-dimensional (e.g., 2.5-dimensional or pseudo-3D) scene picture that better approximates a three-dimensional expression effect. Alternatively, the present disclosure may generate, from a three-dimensional scene picture with a poor expression effect, a new three-dimensional scene picture with a better one.
Fig. 1 shows a schematic flow diagram of a data processing method according to an embodiment of the present disclosure. The method shown in fig. 1 may be performed by a (back-end) server or by a (front-end) client device running an application.
Referring to fig. 1, texture data and topographic data may first be derived based on a two-dimensional scene picture, i.e., texture data and topographic data may be separated from the two-dimensional scene picture. The texture data is used to characterize surface texture in the two-dimensional scene picture, and the terrain data is used to characterize terrain in the two-dimensional scene picture.
The surface texture characterized by the texture data separated from the two-dimensional scene picture is usually a flat, two-dimensional surface texture: it carries no illumination expression effect, and its texture detail is rough.
Likewise, the terrain data separated from the two-dimensional scene picture can usually only indicate the terrain type at each position in the two-dimensional scene picture, not the terrain height.
After the texture data and the terrain data are obtained, a three-dimensional scene picture can be generated based on the texture data and the terrain data.
The three-dimensional scene picture is generated based on texture data and topographic data separated from the two-dimensional scene picture, and can represent the expression effect of scene contents in the two-dimensional scene picture in a three-dimensional scene.
As shown in fig. 1, a texture map representing the surface texture's expression effect in the three-dimensional scene may be obtained from the texture data, a height field representing the terrain height in the three-dimensional scene may be obtained from the terrain data, and the three-dimensional scene picture may then be generated based on the texture map and the height field. The texture map and height field acquisition steps may be performed in parallel.
The texture map (e.g., a diffuse texture map, DiffuseTex) is used to represent, in the three-dimensional scene, the expression effect of the surface texture of the two-dimensional scene picture; that is, the texture map can be regarded as a three-dimensional surface texture, a map capable of showing how the surface texture of the two-dimensional scene picture would realistically appear in a three-dimensional scene.
The height field is used to represent, in the three-dimensional scene, the height of the terrain of the two-dimensional scene picture; that is, the height field can be regarded as the three-dimensional terrain, a topographic map capable of showing how the terrain of the two-dimensional scene picture would realistically appear in a three-dimensional scene.
The texture map and the height field correspond to the same scene area. Therefore, when generating the three-dimensional scene picture from them, the texture map can be added onto the three-dimensional terrain represented by the height field: the texture map is attached to the height field by way of mapping, yielding a three-dimensional scene picture that has both the spatial expression effect of the surface texture and the stereoscopic expression effect of the terrain.
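A minimal sketch of this map-attaching step (an illustration, not the patent's implementation; all names and data are hypothetical): each grid position samples the height field for its elevation and the texture map for its surface color at the same position, so texture and terrain stay aligned over the same scene area.

```python
def build_textured_terrain(height_field, texture_map):
    """Attach a texture map to the 3D terrain described by a height field.

    height_field: H x W grid of terrain heights.
    texture_map:  H x W grid of texel values covering the same scene area.
    Returns a list of vertices (x, y, z, texel): the texture is "pasted"
    onto the height field by sampling both grids at the same position.
    """
    rows, cols = len(height_field), len(height_field[0])
    vertices = []
    for i in range(rows):
        for j in range(cols):
            x, y = j, i                  # grid position in the scene area
            z = height_field[i][j]       # terrain height from the height field
            texel = texture_map[i][j]    # surface texture at the same position
            vertices.append((x, y, z, texel))
    return vertices

# A 2x2 scene area: flat ground with one raised corner, three surface textures.
heights = [[0.0, 0.0], [0.0, 5.0]]
texels = [["grass", "grass"], ["sand", "rock"]]
mesh = build_textured_terrain(heights, texels)
```

A real renderer would build triangles and interpolated UV coordinates rather than per-vertex texels, but the alignment principle is the same.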
The quality of the spatial expression effect of the surface texture in the three-dimensional scene picture depends on the quality of the texture map, and the quality of the stereoscopic expression effect of the terrain depends on the quality of the height field. The generation of the texture map and the height field is described below by way of example.
1. Texture map acquisition process
The surface texture characterized by the texture data, which is exactly the surface texture of the two-dimensional scene picture, can be used directly as the brush texture. That is, the surface texture directly obtainable from the two-dimensional scene picture serves as the brush texture, which can be considered the base texture.
Next, the transparency of the surface texture (i.e., of the brush texture), namely its Alpha channel value, is obtained. The brush texture together with its transparency may be referred to as a (3-channel) Splatmap, which reflects the brush texture and its transparency at each position of the scene area corresponding to the two-dimensional scene picture.
Then the normal texture and/or the illumination texture of the surface texture are acquired. The surface texture of a two-dimensional scene picture usually comes with neither. The normal texture, also called a bump map (Bump Mapping), helps better simulate the surface detail of the real world; the illumination texture helps better simulate real-world lighting. Both can be obtained with various existing normal-texture and illumination-texture techniques; their acquisition is not the focus of this disclosure and is not detailed here.
Finally, according to the transparency of the surface texture, the brush texture is fused with its normal texture and/or illumination texture to obtain a texture map that shows the surface texture's expression effect in the three-dimensional scene. Fusion here means fusing the brush, normal, and/or illumination textures at the same position; the fused texture map reflects the surface texture's expression effect in the three-dimensional scene.
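The per-position fusion can be sketched as follows. This is a simplified, hypothetical model using scalar grayscale values: each surface-texture layer contributes its lit brush value weighted by its Splatmap transparency (Alpha), and the weighted contributions are normalized. Real engines blend RGB texels and perturb lighting with the normal map per pixel.

```python
def fuse_texture_layers(layers):
    """Fuse brush textures into one texel according to per-layer transparency.

    layers: list of (brush_value, lighting_factor, alpha) for each surface
    texture present at one position, e.g. read from a Splatmap. Each layer
    contributes its lit brush value weighted by its alpha; the weights are
    normalized so fully opaque layers dominate the fused result.
    """
    total_alpha = sum(alpha for _, _, alpha in layers)
    if total_alpha == 0:
        return 0.0
    fused = sum(brush * light * alpha for brush, light, alpha in layers)
    return fused / total_alpha

# Two layers at one position: bright grass (alpha 0.75), dark rock (alpha 0.25).
texel = fuse_texture_layers([(0.8, 1.0, 0.75), (0.4, 1.0, 0.25)])
```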
As an example, texture data separated from a two-dimensional scene picture may include one or more layers of texture layer data, each layer of texture layer data corresponding to a surface texture, the texture layer data used to characterize the surface texture at different locations in a region of the scene to which the two-dimensional scene picture corresponds. The surface texture refers to the texture of surface objects at different positions in a scene area corresponding to the two-dimensional scene picture. Surface objects may include, but are not limited to, sand, grass, snow, rocks, dirt, trees, buildings, and thus surface texture may include one or more of the following: sand, grass, snow, rock, soil, trees, buildings.
Each layer of texture layer data can represent the distribution condition of the earth surface texture corresponding to the texture layer data in the whole scene area corresponding to the two-dimensional scene picture. Thus, the texture data of the entire scene area corresponding to the two-dimensional scene picture may be regarded as being composed of multiple layers of texture layer data, that is, the texture layer data may include, but is not limited to, one or more of a sand layer, a grass layer, a snow layer, a rock layer, a mud layer, a tree layer, a building layer, and the like.
The texture layer data may be in units of pixels, that is, the texture layer data may represent, in units of pixels, the distribution of the surface texture corresponding to the texture layer data at different pixels in the two-dimensional scene picture.
When acquiring the normal texture and/or the illumination texture of the surface texture, the normal texture and/or the illumination texture can be obtained separately for each layer of texture layer data; during fusion, the brush, normal, and/or illumination textures of all layers are then fused together to obtain the texture map.
2. Acquisition process of height field
A base height field of at least a portion of a scene area in a two-dimensional scene picture is determined. The base height field represents the base height distribution of each position in a scene area corresponding to the two-dimensional scene picture.
The base height of each location may be related to the topography of each location, i.e. the base height of each location may be set according to the topography of each location. Alternatively, all the positions may share a base height, for example, a height value with the largest occurrence number (or probability) in the actual three-dimensional scene may be empirically used as the base height of each position, so that the workload may be reduced to some extent.
As an example, the base height field may be generated with a noise algorithm. Different noise algorithms can be used for the terrain of different areas, so that the generated base height field resembles the actual terrain and looks more realistic. Noise algorithms may include, but are not limited to, Perlin noise, sinusoidal noise, and Worley noise (also known as cell noise).
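As a hedged illustration of this noise step, the sketch below generates a base height field with simple sinusoidal noise only; the function name and parameters are hypothetical, and a real implementation would pick Perlin, sinusoidal, or Worley noise per region to match the local terrain.

```python
import math

def sinusoidal_base_height(rows, cols, amplitude=1.0, freq=0.5):
    """Generate a rows x cols base height field using sinusoidal noise.

    Each grid position gets a smoothly varying height from overlapping
    sine/cosine waves, giving gentle rolling terrain as a starting point
    for later terrain-specific processing (rivers, ridges, ...).
    """
    return [[amplitude * (math.sin(freq * i) + math.cos(freq * j))
             for j in range(cols)]
            for i in range(rows)]

base = sinusoidal_base_height(4, 4)
```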
After the basic height field is obtained, the basic height field can be processed according to the terrain of at least part of the scene area represented by the terrain data, and the height field capable of representing the terrain height of the terrain in the three-dimensional scene is obtained. The basic altitude field is processed in different ways according to different terrains.
For example, the portion of the base height field corresponding to river terrain may be excavated to carve out surface rivers (i.e., the terrain height is lowered), and the portion corresponding to mountain terrain may be ridged to generate mountains (i.e., the terrain height is raised).
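This second-pass shaping can be sketched with simple terrain masks. This is a hypothetical minimal model: real processing would smooth river banks and mountain ridges rather than apply flat offsets.

```python
def shape_height_field(base, river_mask, mountain_mask,
                       river_depth=2.0, ridge_height=3.0):
    """Second-pass processing of a base height field using terrain masks.

    Positions flagged by the river mask are excavated (height lowered);
    positions flagged by the mountain mask are ridged (height raised).
    Returns a new height field; the base height field is left unchanged.
    """
    rows, cols = len(base), len(base[0])
    out = [row[:] for row in base]
    for i in range(rows):
        for j in range(cols):
            if river_mask[i][j]:
                out[i][j] -= river_depth     # carve out a surface river
            if mountain_mask[i][j]:
                out[i][j] += ridge_height    # raise a mountain ridge
    return out

flat = [[1.0, 1.0], [1.0, 1.0]]
rivers = [[1, 0], [0, 0]]
mountains = [[0, 0], [0, 1]]
shaped = shape_height_field(flat, rivers, mountains)
```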
As an example, terrain data separated from a two-dimensional scene picture may include one or more layers of terrain layer data, each layer of terrain layer data corresponding to a type of terrain, the terrain layer data being used to characterize terrain at different locations in a region of the scene to which the two-dimensional scene picture corresponds. The terrain may include, but is not limited to, sand, grass, snow, mountains, rivers, roads, hills, plains, basins, swamps.
Each layer of terrain layer data can represent the distribution of its corresponding terrain across the entire scene area corresponding to the two-dimensional scene picture. The terrain data of the entire scene area may therefore be regarded as composed of multiple layers of terrain layer data, which may include, but are not limited to, one or more of sand cover layer (sand Mask layer) data, grass cover layer (grass Mask layer) data, snow cover layer (snow Mask layer) data, mountain cover layer (mountain Mask layer) data, river cover layer (river Mask layer) data, and road cover layer (road Mask layer) data.
The terrain data also comprises logic data used for representing the hierarchical relationship between terrains corresponding to different terrain layer data at the same position. Here, the hierarchical relationship is the relationship between coverage and coverage between different terrains. For example, a grass cover layer may cover a hill cover layer, and a snow cover layer may cover the grass cover layer.
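The cover/covered relationship between terrain layers can be modeled as a precedence ordering, as in this hypothetical sketch (the ordering and names are illustrative, not the patent's encoding of the logic data):

```python
def visible_terrain(position_layers, precedence):
    """Resolve which terrain is visible at one position.

    position_layers: set of terrain layer names present at the position.
    precedence: layer names ordered bottom to top; a later layer covers
    an earlier one, encoding the cover/covered hierarchical relationship.
    """
    visible = None
    for layer in precedence:           # walk bottom-up; last present layer wins
        if layer in position_layers:
            visible = layer
    return visible

# Snow covers grass, grass covers hill (a hypothetical ordering).
order = ["hill", "grass", "snow"]
top = visible_terrain({"hill", "grass"}, order)
```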
The unit of the terrain layer data can be a pixel, that is, the terrain layer data can represent the distribution of the terrain corresponding to the terrain layer data at different pixels in the two-dimensional scene picture in the unit of pixel.
The unit of the terrain layer data can also be a cell (i.e. a grid) corresponding to a certain area, that is, the terrain layer data can represent the distribution situation of the terrain corresponding to the terrain layer data in different cells in the two-dimensional scene picture by taking the cell as the unit.
Taking the cell as the unit of terrain layer data as an example, when deriving from the terrain data a height field that represents the terrain height in the three-dimensional scene, logical operations such as AND and OR can first be applied to the terrain attributes of each cell, according to that cell's terrain layer data, to generate the cell's map logical attribute (MaskMap). The MaskMap is the logical-operation result over all terrain data layers of a single cell and characterizes the terrain attributes at different positions within the cell. The base height field can then be generated with different noise algorithms according to the terrain attributes in the MaskMap, i.e., the terrain at different positions. Finally, the base height field is processed a second time according to the landforms at different positions in the MaskMap, so that the processed height field better matches real landforms. Through these steps, a height field capable of representing the 3D terrain is obtained.
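The per-cell logical operations can be sketched with bit masks. The bit assignments and helper names below are hypothetical; they only illustrate combining a cell's terrain layers with OR and querying them with AND.

```python
# Hypothetical bit assignments, one per terrain Mask layer.
SAND, GRASS, SNOW, MOUNTAIN, RIVER, ROAD = 1, 2, 4, 8, 16, 32

def cell_mask_map(layer_bits):
    """OR together the terrain layers present in one cell into a MaskMap value."""
    mask = 0
    for bit in layer_bits:
        mask |= bit
    return mask

def has_terrain(mask, bit):
    """AND test: does the cell's MaskMap contain the given terrain?"""
    return (mask & bit) != 0

# A cell containing both grass and mountain terrain.
mask = cell_mask_map([GRASS, MOUNTAIN])
```

Downstream steps could then branch on `has_terrain` to pick a noise algorithm or a second-pass treatment per cell.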
As shown in fig. 2, taking the data processing method of the present disclosure executed by the client device as an example, a plurality of first tasks (i.e., texture mapping tasks) for determining texture maps of different regions in a scene region may be generated by a Central Processing Unit (CPU) in the client device based on the scene region represented by the two-dimensional scene picture, the plurality of first tasks may be handed to a graphics processing unit (i.e., GPU) for execution, and/or
A plurality of second tasks for determining the height fields of different regions in the scene area (i.e., height field tasks) may be generated by a central processor in the client device based on the scene area represented by the two-dimensional scene picture, and the plurality of second tasks may be handed to a graphics processor for execution.
The central processing unit can also retrieve task execution results from the graphics processor and generate a three-dimensional scene picture according to the task execution results returned by the graphics processor.
The texture mapping task and the height field task are both image processing tasks involving pixel-level intensive computation. The present disclosure uses a hybrid CPU+GPU architecture: the CPU generates the tasks (texture mapping tasks/height field tasks) for such image processing and sends them to the GPU, and the GPU executes the tasks and returns the execution results to the CPU. By moving these logical operations from the CPU to the GPU, performance is ensured while the function is realized.
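The CPU/GPU division of labor can be sketched as below, with the GPU simulated by a thread pool for illustration. The region list, task payloads, and result shapes are assumptions; in a real implementation the tasks would be compute-shader dispatches rather than Python callables.

```python
# Minimal sketch of the hybrid architecture: the "CPU" partitions the scene
# area into regions and generates first tasks (texture maps) and second tasks
# (height fields); a thread pool stands in for the GPU that executes them.
from concurrent.futures import ThreadPoolExecutor

def texture_task(region):
    # stand-in for the per-region texture-map computation executed on the GPU
    return ("texture", region)

def height_task(region):
    # stand-in for the per-region height-field computation executed on the GPU
    return ("height", region)

def generate_scene(regions):
    with ThreadPoolExecutor() as gpu:  # simulated GPU
        textures = list(gpu.map(texture_task, regions))
        heights = list(gpu.map(height_task, regions))
    # the CPU gathers the returned results and assembles the 3D scene picture
    return {"textures": textures, "heights": heights}

scene = generate_scene([(0, 0), (0, 1), (1, 0), (1, 1)])
```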
By way of example, art assets and/or user setting data may also be obtained. The three-dimensional scene picture may be modified or corrected based on the art resources and/or the user setting data after the three-dimensional scene picture is generated, or the three-dimensional scene picture may be generated based on texture data, terrain data, art resources, and/or the user setting data in the process of generating the three-dimensional scene picture.
Art resources may be resources that can be reused or referenced to enhance the aesthetic appeal of the resulting three-dimensional scene picture, and may include, but are not limited to, 3D presentations of various materials. The user setting data may be three-dimensional parameter information (e.g., texture parameter, terrain height parameter) set by a user (e.g., a user of an application program for presenting a three-dimensional scene picture), or element information that the user desires to add in the three-dimensional scene picture.
By multiplexing art resources, the generated three-dimensional scene picture can be more refined. By generating the three-dimensional scene picture according to the user setting data, the generated three-dimensional scene picture can better meet the personalized watching requirements of the user.
The basic implementation flow of the data processing method of the present disclosure is described in detail with reference to fig. 1 and fig. 2.
The present disclosure also provides an application interface display method, which may be executed by a client device for running an application program, the application interface display method including: displaying an application interface, wherein the application interface comprises a two-dimensional scene picture; and responding to a switching request of a user, and displaying a three-dimensional scene picture generated based on the two-dimensional scene picture in the application interface. The three-dimensional scene may be generated in real time in response to the switching request, or the three-dimensional scene may be obtained in advance.
That is, the three-dimensional scene screen may be generated in real time in response to a switching request from the user, or may be generated in the background in advance, and the three-dimensional scene screen may be directly called and displayed in response to the switching request from the user.
The three-dimensional scene picture can be generated from the resources of the two-dimensional scene picture; its content is consistent with the two-dimensional scene picture, but its visual representation is better.
For example, a three-dimensional scene picture may be generated using the data processing method of the present disclosure.
The present disclosure also provides a game interface display method, which may be executed by a client device for running a game, where the game may be a strategic simulation game based on tile interaction (such as a sand table strategic game), and a game scene may include a large number of tiles, and the game interface display method includes: responding to the switching operation of a user for the two-dimensional plots in the game interface, and generating three-dimensional plots based on map resources of the two-dimensional plots; and displaying a game interface including the three-dimensional parcel.
A two-dimensional tile refers to a tile used to characterize a two-dimensional game scene, i.e., a tile in a 2D map. In response to a switching operation of a user for a two-dimensional parcel in a game interface, a three-dimensional parcel, which is a parcel capable of representing a three-dimensional game scene, can be generated in real time based on map resources of the two-dimensional parcel.
FIG. 3 shows a schematic flow diagram of a process for generating a three-dimensional parcel based on map resources of a two-dimensional parcel. The process of generating a three-dimensional parcel based on map resources of a two-dimensional parcel is described in an exemplary manner with reference to fig. 3, and for the details involved therein, reference may be made to the description of the data processing method above.
Referring to FIG. 3, a plurality of texture layer data and a plurality of terrain layer data (also referred to as logical layer data) may be separated based on the map resources of a two-dimensional parcel (i.e., the contents of a two-dimensional parcel). The units of the texture layer may be pixels and may include, but are not limited to, a sand layer, a grass layer, and a snow layer. The unit of terrain layer data may be a cell (also referred to as a logical grid) and may include, but is not limited to, a sand Mask layer, a grass Mask layer, a snow Mask layer, a mountain Mask layer, a river Mask layer, a road Mask layer.
The process from texture layer data to texture map is as follows: the texture of each sub-layer in the texture layer is used as a brush texture, and the Alpha channels of the sub-layers are extracted and combined to generate a 3-channel Splatmap; the brush texture and normal texture of each sub-layer are then taken as input, per-layer lighting is applied to generate lit textures, and the lit textures are fused according to the Splatmap to generate the 3D surface diffuse texture (DiffuseTex).
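The Splatmap fusion step can be sketched per pixel as below. Single scalar intensities stand in for full lit textures, and the three layers and their weights are illustrative assumptions, not values from the patent.

```python
# Sketch of the Splatmap fusion: each sub-layer's Alpha channel becomes one
# Splatmap channel, and the per-pixel surface color is the weighted blend of
# the lit brush textures. Scalars stand in for textures for brevity.
def blend_pixel(lit_textures, splat_weights):
    """Fuse lit sub-layer textures at one pixel by their Splatmap weights."""
    total = sum(splat_weights) or 1.0  # guard against an all-zero Splatmap pixel
    return sum(t * w for t, w in zip(lit_textures, splat_weights)) / total

# sand, grass, snow lit-texture intensities at one pixel (assumed values)
lit = [0.8, 0.4, 1.0]
# Alpha channels extracted from the three sub-layers -> one 3-channel Splatmap pixel
splat = [0.5, 0.5, 0.0]
diffuse = blend_pixel(lit, splat)  # -> 0.6 (half sand, half grass, no snow)
```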
The process from terrain layer data to height field is as follows: AND, OR, and inequality logical operations are performed on the attributes of each logical grid cell in each sub-layer of the logical layer to generate the map logical attribute MaskMap; a base height field (HeightField) is generated with different noise algorithms according to the logical-grid masks in the MaskMap; the height field is processed a second time according to the river-channel cells in the MaskMap to carve out surface river channels, and a third time according to the mountain cells in the MaskMap to bulge the terrain into mountains; through these steps, the 3D terrain height field is obtained.
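The secondary and tertiary processing steps (carving river channels, bulging mountains) can be sketched as follows. The mask bit encoding and the fixed carve/bulge offsets are assumptions for illustration; a real implementation would shape the offsets with noise and falloff.

```python
# Sketch of the secondary/tertiary height-field processing: lower the cells
# marked as river channel and raise the cells marked as mountain. The bit
# encoding and offsets are illustrative assumptions.
RIVER, MOUNTAIN = 4, 2  # assumed MaskMap bit values

def refine_height_field(base, maskmap, carve=2.0, bulge=5.0):
    out = []
    for brow, mrow in zip(base, maskmap):
        row = []
        for h, attrs in zip(brow, mrow):
            if attrs & RIVER:
                h -= carve   # dig out the surface river channel
            if attrs & MOUNTAIN:
                h += bulge   # bulge the terrain to form a mountain
            row.append(h)
        out.append(row)
    return out

base = [[1.0, 1.0], [1.0, 1.0]]
mask = [[0, MOUNTAIN], [RIVER, 0]]
refined = refine_height_field(base, mask)  # -> [[1.0, 6.0], [-1.0, 1.0]]
```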
Finally, a 3D terrain Mesh is generated from the height field, and the DiffuseTex is applied to the 3D terrain Mesh to generate the 3D terrain.
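The height-field-to-mesh step can be sketched as a regular grid: one vertex per height sample and two triangles per cell. The vertex layout (x, height, y) and winding order are illustrative choices, not specified by the patent.

```python
# Sketch of building a grid mesh from a height field: one vertex per sample,
# two triangles per grid cell, vertices indexed row-major.
def height_field_to_mesh(field):
    rows, cols = len(field), len(field[0])
    verts = [(x, field[y][x], y) for y in range(rows) for x in range(cols)]
    tris = []
    for y in range(rows - 1):
        for x in range(cols - 1):
            i = y * cols + x
            tris.append((i, i + cols, i + 1))             # lower-left triangle
            tris.append((i + 1, i + cols, i + cols + 1))  # upper-right triangle
    return verts, tris

verts, tris = height_field_to_mesh([[0.0, 1.0], [2.0, 3.0]])
# 4 vertices and 2 triangles for a 2x2 height field
```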
Thus, the present disclosure may be implemented as a Procedural Terrain Generation (PTG) scheme. The PTG scheme of the present disclosure, which generates 3D terrain representation and logic in real time through 2D map representation and logic, may be utilized to quickly switch a 2D game screen (e.g., a 2D parcel) to a 3D game screen (e.g., a 3D parcel) in response to a switching operation of a player during a game run by a client.
The present disclosure can produce at least the following advantageous effects: 1) the whole process is compatible with both 2D and 3D, and the game can switch between 2D and 3D according to player preference; 2) 2D resources are reused as much as possible, avoiding large-scale re-production of art resources and the associated art workload; 3) for a very large map (e.g., 2000x2000), producing it in the traditional way would require so many art map resources that the game package size would exceed product limits, a problem this method does not have; 4) the performance and compatibility of the 3D version are consistent with the 2D version, while the 3D art representation is better than the 2D version.
Because this scheme converts the 2D map to 3D terrain in real time, the generation time of a single parcel needs to be kept under 50 ms so that the user experience does not stall. The whole processing flow in fig. 3 involves a large number of pixel-level operations; to achieve the 50 ms goal, the implementation may adopt a parallel CPU+GPU architecture in which the CPU performs main flow control, GPU input-data initialization, and output-data retrieval, while the GPU performs the pixel-level intensive operations.
In conclusion, the PTG scheme of the present disclosure adds no new art resources: it reuses 2D map resources to generate 3D terrain in real time, so artists do not need to produce a separate set of 3D resources, and art workload, production cycle, and cost do not increase. Since art resources are not significantly increased, the package size is not significantly increased either. Moreover, the computation-intensive tasks are concentrated on the GPU, exploiting the GPU's strength in intensive computation, so performance is not significantly affected; testing shows that 3D-version performance and compatibility remain basically consistent with the 2D version.
In one embodiment of the disclosure, an application interface display method is also provided. The application interface display method may be performed by a client device running an application program. The application interface display method may include the following steps.
At step SA1, an application interface is displayed, the application interface including a two-dimensional scene picture.
At step SA2, in response to a switching request by a user, it is determined whether there is a three-dimensional scene picture generated based on a two-dimensional scene picture.
It is determined whether a three-dimensional scene picture generated based on the two-dimensional scene picture exists, that is, whether a three-dimensional scene picture generated in advance exists. The three-dimensional scene may be generated by the client device or in the background (e.g., by a background server).
At step SA3, if a three-dimensional scene picture is present, the three-dimensional scene picture is displayed, and if no three-dimensional scene picture is present, a three-dimensional scene picture is generated based on a two-dimensional scene picture, and the generated three-dimensional scene picture is displayed. Taking the generation of the three-dimensional scene picture at the server as an example, if the three-dimensional scene picture exists at the server, the three-dimensional scene picture can be acquired from the server.
Therefore, when the pre-generated three-dimensional scene picture exists in the background, the three-dimensional scene picture can be directly displayed, and when the pre-generated three-dimensional scene picture does not exist in the background, the three-dimensional scene picture can be generated by the client device on the basis of the two-dimensional scene picture in real time. For the implementation process of generating the three-dimensional scene picture based on the two-dimensional scene picture, reference may be made to the above related description, which is not repeated in this embodiment.
In one embodiment of the present disclosure, an auxiliary operation method is also provided. The secondary operation method may be performed by a client device or a server, which may provide a user (e.g., a designer) with a 2D to 3D tool by performing the secondary operation method of the present disclosure. The secondary operation method may include the following steps.
At step SB1, a two-dimensional scene image input by a user is received. The two-dimensional scene image input by the user is the two-dimensional scene image that needs to be converted into a three-dimensional scene image.
At step SB2, in response to a three-dimensional conversion request from the user for a two-dimensional scene image, a three-dimensional scene image is generated based on the two-dimensional scene image. The two-dimensional scene image in this embodiment is equivalent to the two-dimensional scene image mentioned above, and the three-dimensional scene image is equivalent to the three-dimensional scene image, and the implementation process of generating the three-dimensional scene image based on the two-dimensional scene image may refer to the description of the implementation process of generating the three-dimensional scene image based on the two-dimensional scene image above, which is not repeated in this embodiment.
At step SB3, the three-dimensional scene image is provided to the user.
The present disclosure may further receive an adjustment parameter set by the user for the three-dimensional scene image, wherein step SB2 may include: a three-dimensional scene image is generated based on the two-dimensional scene image and the adjustment parameter such that the parameter of the generated three-dimensional scene image conforms to the adjustment parameter. Wherein the adjustment parameters may include, but are not limited to, adjustments to the surface texture and/or topography.
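Applying a user adjustment parameter during generation can be sketched as below. The parameter name `terrain_height_scale` and its default are assumptions for illustration; the patent names surface-texture and terrain adjustments only generically.

```python
# Sketch of applying user adjustment parameters: a uniform terrain-height
# scale applied to a generated height field. Parameter name is assumed.
def apply_adjustments(height_field, params):
    scale = params.get("terrain_height_scale", 1.0)
    return [[h * scale for h in row] for row in height_field]

adjusted = apply_adjustments([[1.0, 2.0]], {"terrain_height_scale": 1.5})
# -> [[1.5, 3.0]]: the generated terrain conforms to the user's adjustment
```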
In one embodiment of the present disclosure, an application interface display method is also provided, which may be performed by a client device running an application program. The application interface display method may include the following steps.
At step SC1, an application interface is displayed, the application interface including a two-dimensional scene picture.
In step SC2, in response to the switching request by the user, a three-dimensional scene picture is generated based on the adjustment parameters set by the user and the two-dimensional scene picture. The adjustment parameters, i.e., the 3D display parameters set by the user, may include, but are not limited to, adjustments to the surface texture and/or the terrain. For the implementation process of generating the three-dimensional scene picture based on the two-dimensional scene picture, reference may be made to the above related description, which is not repeated in this embodiment.
In step SC3, the three-dimensional scene picture is displayed in the application interface.
Thereby, a three-dimensional scene screen in accordance with the user preference can be generated and displayed.
In one embodiment of the present disclosure, an application interface display method is also provided, which may be performed by a client device running an application program. The application interface display method may include the following steps.
In step SD1, an application interface is displayed, and the application interface includes a two-dimensional scene picture.
At step SD2, in response to the execution environment of the application program meeting the preset condition, a three-dimensional scene picture is generated based on the two-dimensional scene picture. After the three-dimensional scene is generated, the three-dimensional scene may be displayed in an application interface. Thus, whether to convert from 2D to 3D may be determined according to the execution environment of the application. The running environment of the application program meets the preset condition, which may mean that the client device running the application program is in an idle state (for example, a state where available resources of the CPU/GPU are sufficient), or in a wifi connection state.
As an example, the three-dimensional scene picture may be modified, and the modified three-dimensional scene picture may be displayed in the application interface. Wherein, the modification can be carried out in the background or by the user. For example, background art resources can be reused and modified by background designers.
The data processing method of the present disclosure may also be implemented as a data processing apparatus. Fig. 4 shows a schematic structural diagram of a data processing apparatus according to an embodiment of the present disclosure. The functional elements of the data processing apparatus may be implemented by hardware, software, or a combination of hardware and software implementing the principles of the present disclosure. It will be appreciated by those skilled in the art that the functional units described in fig. 4 may be combined or divided into sub-units to implement the principles of the invention described above. Thus, the description herein may support any possible combination, or division, or further definition of the functional units described herein.
In the following, functional units that the data processing apparatus may have and operations that each functional unit may perform are briefly described, and for details related thereto, reference may be made to the above-mentioned related description, which is not described herein again.
Referring to fig. 4, the data processing apparatus 400 includes an acquisition module 410 and a generation module 420. The data processing apparatus 400 may be deployed on a server side or a client device.
The obtaining module 410 is configured to obtain texture data and terrain data based on the two-dimensional scene picture, where the texture data is used to represent a surface texture in the two-dimensional scene picture, and the terrain data is used to represent a terrain in the two-dimensional scene picture. The generating module 420 is configured to generate a three-dimensional scene picture based on the texture data and the terrain data.
The generation module 420 may include a first generation module, a second generation module, and a third generation module. The first generation module is used for obtaining a texture map capable of showing the representation effect of the surface texture in the three-dimensional scene based on the texture data; the second generation module is used for obtaining a height field capable of reflecting the terrain height of the terrain in the three-dimensional scene based on the terrain data; and the third generation module is used for generating a three-dimensional scene picture based on the texture map and the height field.
The first generating module may be specifically configured to: taking the surface texture represented by the texture data as a brush texture, and acquiring the transparency of the surface texture; acquiring a normal texture and/or an illumination texture of a ground surface texture; and according to the transparency of the surface texture, fusing the brush texture, the normal texture and/or the illumination texture of the surface texture to obtain a texture map capable of showing the representation effect of the surface texture in the three-dimensional scene.
The second generating module may be specifically configured to: determining a base height field of at least a part of a scene area in a two-dimensional scene picture; and processing the basic height field according to the terrain of at least part of the scene area represented by the terrain data to obtain a height field capable of representing the terrain height in the three-dimensional scene.
The third generating module may be specifically configured to: add the texture map to the three-dimensional terrain represented by the height field to obtain a three-dimensional scene picture.
Taking the example of the data processing apparatus 400 deployed on a client device, a plurality of first tasks for determining texture maps of different regions in a scene region may be generated by a central processor in the client device based on the scene region represented by a two-dimensional scene picture, the plurality of first tasks being handed to a graphics processor for execution, and/or a plurality of second tasks for determining height fields of different regions in the scene region may be generated by a central processor in the client device based on the scene region represented by a two-dimensional scene picture, the plurality of second tasks being handed to a graphics processor for execution.
The texture data may comprise one or more layers of texture layer data, each layer of texture layer data corresponding to a surface texture, the texture layer data being used to characterize the surface texture at different locations in a corresponding scene region of the two-dimensional scene picture.
The surface texture may include one or more of: sand, grass, snow, rock, soil, trees, buildings.
The terrain data may include one or more layers of terrain layer data, each layer of terrain layer data corresponding to a terrain, the terrain layer data being used to characterize the terrain at different positions in a scene region corresponding to the two-dimensional scene picture, the terrain data further including logic data used to characterize the hierarchical relationship between the terrains corresponding to different terrain layer data at the same position.
The terrain layer data may include one or more of: sand cover layer data, grass cover layer data, snow cover layer data, mountain cover layer data, river cover layer data, and road cover layer data.
The obtaining module 410 may also be used to obtain art assets and/or user setting data, as examples. The data processing apparatus 400 may further include a modification module for modifying the three-dimensional scene picture based on the art resources and/or the user setting data. Or the generating module 420 may generate a three-dimensional scene picture based on texture data, terrain data, art resources, and/or user setup data.
The application interface display method of the present disclosure may also be implemented as an application interface display apparatus. Fig. 5 shows a schematic structural diagram of an application interface display device according to an embodiment of the present disclosure. The functional elements of the application interface display apparatus may be implemented by hardware, software, or a combination of hardware and software implementing the principles of the present disclosure. It will be appreciated by those skilled in the art that the functional units described in fig. 5 may be combined or divided into sub-units to implement the principles of the invention described above. Thus, the description herein may support any possible combination, or division, or further definition of the functional units described herein.
In the following, brief descriptions are given to functional units that the application interface display device can have and operations that each functional unit can perform, and for details related thereto, reference may be made to the above-mentioned related descriptions, which are not described herein again.
Referring to fig. 5, the application interface display device 500 includes a display module 510. The application interface display apparatus 500 may be deployed on a client device.
The display module 510 is configured to display an application interface, where the application interface includes a two-dimensional scene picture, and the display module 510 is further configured to display a three-dimensional scene picture generated based on the two-dimensional scene picture in the application interface in response to a switching request of a user.
The three-dimensional scene picture may be generated in real time in response to the switching request, or may be obtained in advance.
Taking the example that the three-dimensional scene picture is generated in real time in response to the switching request, the application interface display apparatus 500 may further include a data processing module. The data processing module is used for obtaining texture data and topographic data based on the two-dimensional scene picture, the texture data is used for representing the earth surface texture in the two-dimensional scene picture, and the topographic data is used for representing the topography in the two-dimensional scene picture; and generating a three-dimensional scene picture based on the texture data and the terrain data.
The data processing module may include a first generation module, a second generation module, and a third generation module. The first generation module is used for obtaining a texture map capable of showing the representation effect of the surface texture in the three-dimensional scene based on the texture data; the second generation module is used for obtaining a height field capable of reflecting the terrain height of the terrain in the three-dimensional scene based on the terrain data; and the third generation module is used for generating a three-dimensional scene picture based on the texture map and the height field.
The first generating module may be specifically configured to: taking the surface texture represented by the texture data as a brush texture, and acquiring the transparency of the surface texture; acquiring a normal texture and/or an illumination texture of a ground surface texture; and according to the transparency of the surface texture, fusing the brush texture, the normal texture and/or the illumination texture of the surface texture to obtain a texture map capable of showing the representation effect of the surface texture in the three-dimensional scene.
The second generating module may be specifically configured to: determining a base height field of at least a part of a scene area in a two-dimensional scene picture; and processing the basic height field according to the terrain of at least part of the scene area represented by the terrain data to obtain a height field capable of representing the terrain height in the three-dimensional scene.
The third generating module may be specifically configured to: add the texture map to the three-dimensional terrain represented by the height field to obtain a three-dimensional scene picture.
The data processing module can also modify the three-dimensional scene picture based on art resources and/or user setting data, or the data processing module can also generate the three-dimensional scene picture based on texture data, terrain data, art resources and/or user setting data.
The game interface display method of the present disclosure may also be implemented as a game interface display device. Fig. 6 shows a schematic structural diagram of a game interface display device according to an embodiment of the present disclosure. The functional elements of the game interface display device may be implemented by hardware, software, or a combination of hardware and software that implements the principles of the present disclosure. It will be appreciated by those skilled in the art that the functional units described in fig. 6 may be combined or divided into sub-units to implement the principles of the invention described above. Thus, the description herein may support any possible combination, or division, or further definition of the functional units described herein.
In the following, brief descriptions are given to functional units that the game interface display device can have and operations that each functional unit can perform, and details related thereto may be referred to the above description, and are not repeated here.
Referring to fig. 6, the game interface display apparatus 600 includes a generation module 610 and a display module 620.
The generating module 610 is configured to generate a three-dimensional parcel based on map resources of a two-dimensional parcel in response to a user switching operation for the two-dimensional parcel in the game interface. The display module 620 is used to display a game interface including a three-dimensional parcel.
The generation module 610 may include a separation module and a generation sub-module. The separation module is used for obtaining texture data and terrain data based on map resources of the two-dimensional land parcel, the texture data is used for representing the surface texture of the two-dimensional land parcel, and the terrain data is used for representing the terrain of the two-dimensional land parcel. The generation submodule is used for generating a three-dimensional land block based on the texture data and the terrain data.
The generation submodule may include a first generation submodule, a second generation submodule, and a third generation submodule. The first generation submodule is used for obtaining, based on the texture data, a texture map that can embody the representation effect of the surface texture in the three-dimensional scene; the second generation submodule is used for obtaining, based on the terrain data, a height field capable of reflecting the terrain heights in the three-dimensional scene; and the third generation submodule is used for generating the three-dimensional parcel based on the texture map and the height field.
The first generation submodule may be specifically configured to: taking the surface texture represented by the texture data as a brush texture, and acquiring the transparency of the surface texture; acquiring a normal texture and/or an illumination texture of a ground surface texture; and according to the transparency of the surface texture, fusing the brush texture, the normal texture and/or the illumination texture of the surface texture to obtain a texture map capable of showing the representation effect of the surface texture in the three-dimensional scene.
The second generation submodule may be specifically configured to: determining a base height field of at least part of a scene area in a two-dimensional parcel; and processing the basic height field according to the terrain of at least part of the scene area represented by the terrain data to obtain a height field capable of representing the terrain height in the three-dimensional scene.
The third generation submodule may be specifically configured to: add the texture map to the three-dimensional terrain represented by the height field to obtain a three-dimensional parcel.
As an example, the game interface display apparatus 600 may further include a modification module for modifying the three-dimensional scene picture based on the art resources and/or the user setting data, or the generation module may further generate the three-dimensional scene picture based on the texture data, the terrain data, the art resources, and/or the user setting data.
The present disclosure also provides a data processing system, comprising: the image processing system comprises a central processing unit and an image processing unit, wherein the central processing unit is used for generating a task aiming at image processing related to pixel-level intensive operation, sending the task to the image processing unit, and the image processing unit is used for executing the task and sending a task execution result to the central processing unit. The image processing involving pixel-level intensive operations may refer to the processing flow in the data processing method described above. The tasks generated by the central processor for image processing involving pixel-level intensive operations may include the first and second tasks mentioned above. The central processor may generate a three-dimensional scene picture, such as a three-dimensional parcel, based on the received execution result of the first task and the execution result of the second task.
The present disclosure also provides an application interface display device, including: a display module, used for displaying an application interface, the application interface comprising a two-dimensional scene picture; and a judging module, used for responding to a switching request of a user and judging whether a three-dimensional scene picture generated based on the two-dimensional scene picture exists. The display module is further used for displaying the three-dimensional scene picture if it exists, or displaying a three-dimensional scene picture generated by a generation module if it does not.
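The exists-or-generate flow above amounts to a cache lookup, sketched below; the scene identifier, the picture placeholder strings, and the call counter are all hypothetical scaffolding.

```python
# Cache of already-generated three-dimensional scene pictures,
# keyed by a hypothetical scene identifier.
cache = {}
calls = {"n": 0}

def generate_3d(picture):
    # Stand-in for the generation module; counts how often it runs.
    calls["n"] += 1
    return f"3d({picture})"

def on_switch_request(scene_id, two_d_picture):
    # If a three-dimensional scene picture already exists, display it;
    # otherwise generate it, cache it, and display the generated picture.
    if scene_id not in cache:
        cache[scene_id] = generate_3d(two_d_picture)
    return cache[scene_id]

first = on_switch_request("s1", "map2d.png")
second = on_switch_request("s1", "map2d.png")   # reuses the cached picture
```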
The generation module may include a separation module and a generation submodule. The separation module is used for obtaining texture data and terrain data based on the two-dimensional scene picture, the texture data is used for representing the earth surface texture in the two-dimensional scene picture, and the terrain data is used for representing the terrain in the two-dimensional scene picture. The generation submodule is used for generating a three-dimensional scene picture based on the texture data and the terrain data.
The generation submodule may include a first generation submodule, a second generation submodule, and a third generation submodule. The first generation submodule is used for obtaining a texture map which can embody the representation effect of the surface texture in the three-dimensional scene based on the texture data; the second generation submodule is used for obtaining a height field capable of reflecting the terrain height of the terrain in the three-dimensional scene based on the terrain data; and the third generation submodule is used for generating a three-dimensional scene picture based on the texture map and the height field.
The first generation submodule may be specifically configured to: take the surface texture represented by the texture data as a brush texture and acquire the transparency of the surface texture; acquire a normal texture and/or an illumination texture of the surface texture; and, according to the transparency of the surface texture, fuse the brush texture with the normal texture and/or the illumination texture of the surface texture to obtain a texture map capable of showing the representation effect of the surface texture in the three-dimensional scene.
The second generation submodule may be specifically configured to: determine a base height field of at least part of a scene area in the two-dimensional scene picture; and process the base height field according to the terrain of the at least part of the scene area represented by the terrain data, to obtain a height field capable of representing the terrain height in the three-dimensional scene.
The third generation submodule may be specifically configured to: add the texture map to the three-dimensional terrain represented by the height field to obtain a three-dimensional scene picture.
As an example, the application interface display apparatus may further include a modification module configured to modify the three-dimensional scene picture based on the art resources and/or the user setting data, or the generation module may further generate the three-dimensional scene picture based on texture data, terrain data, art resources, and/or the user setting data.
The present disclosure also provides an auxiliary operating device, including: the receiving module is used for receiving a two-dimensional scene image input by a user; the generating module is used for responding to a three-dimensional conversion request of a user for the two-dimensional scene image, and generating a three-dimensional scene image based on the two-dimensional scene image; and a providing module for providing the three-dimensional scene image to the user.
As an example, the receiving module may be further configured to receive an adjustment parameter set by a user for the three-dimensional scene image, wherein the generating module may generate the three-dimensional scene image based on the two-dimensional scene image and the adjustment parameter so that the parameter of the generated three-dimensional scene image conforms to the adjustment parameter.
The generation module may include a separation module and a generation submodule. The separation module is used for obtaining texture data and terrain data based on the two-dimensional scene image, the texture data is used for representing the earth surface texture in the two-dimensional scene image, and the terrain data is used for representing the terrain in the two-dimensional scene image. The generation submodule is used for generating a three-dimensional scene image based on the texture data and the terrain data.
The generation submodule may include a first generation submodule, a second generation submodule, and a third generation submodule. The first generation submodule is used for obtaining a texture map which can embody the representation effect of the surface texture in the three-dimensional scene based on the texture data; the second generation submodule is used for obtaining a height field capable of reflecting the terrain height of the terrain in the three-dimensional scene based on the terrain data; and the third generation submodule is used for generating a three-dimensional scene image based on the texture map and the height field.
The first generation submodule may be specifically configured to: take the surface texture represented by the texture data as a brush texture and acquire the transparency of the surface texture; acquire a normal texture and/or an illumination texture of the surface texture; and, according to the transparency of the surface texture, fuse the brush texture with the normal texture and/or the illumination texture of the surface texture to obtain a texture map capable of showing the representation effect of the surface texture in the three-dimensional scene.
The second generation submodule may be specifically configured to: determine a base height field of at least part of a scene area in the two-dimensional scene image; and process the base height field according to the terrain of the at least part of the scene area represented by the terrain data, to obtain a height field capable of representing the terrain height in the three-dimensional scene.
The third generation submodule may be specifically configured to: add the texture map to the three-dimensional terrain represented by the height field to obtain a three-dimensional scene image.
As an example, the auxiliary operating device may further include a modification module for modifying the three-dimensional scene image based on the art resources and/or the user setting data, or the generation module may further generate the three-dimensional scene image based on the texture data, the terrain data, the art resources, and/or the user setting data.
The present disclosure also provides an application interface display device, including: the display module is used for displaying an application interface, and the application interface comprises a two-dimensional scene picture; the generating module is used for responding to a switching request of a user and generating a three-dimensional scene picture based on an adjusting parameter and a two-dimensional scene picture set by the user; and the display module is also used for displaying the three-dimensional scene picture in the application interface.
The generation module may include a separation module, an adjustment module, and a generation submodule. The separation module is used for obtaining texture data and terrain data based on the two-dimensional scene image, the texture data is used for representing the earth surface texture in the two-dimensional scene image, and the terrain data is used for representing the terrain in the two-dimensional scene image. The adjusting module may adjust the texture data and the terrain data based on the adjusting parameter, and the generating sub-module may be configured to generate the three-dimensional scene image based on the adjusted texture data and the terrain data.
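The adjusting step can be sketched as below; the specific parameter names (`brightness`, `height_scale`) and the flat-list encodings of the texture and terrain data are hypothetical stand-ins for the patent's adjustment parameters.

```python
def apply_adjustments(texture_data, terrain_data, params):
    """Adjust the separated texture and terrain data according to user-set
    adjustment parameters before generating the three-dimensional scene."""
    bright = params.get("brightness", 1.0)      # assumed texture parameter
    vscale = params.get("height_scale", 1.0)    # assumed terrain parameter
    # Scale texture intensities, clamping to the valid [0, 1] range.
    adj_texture = [min(1.0, v * bright) for v in texture_data]
    # Scale terrain heights vertically.
    adj_terrain = [v * vscale for v in terrain_data]
    return adj_texture, adj_terrain

tex, ter = apply_adjustments([0.2, 0.9], [10.0, -2.0],
                             {"brightness": 1.5, "height_scale": 2.0})
```

The generation submodule would then consume `tex` and `ter` in place of the raw separated data, so the generated three-dimensional scene image conforms to the user's adjustment parameters.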
The generation submodule may include a first generation submodule, a second generation submodule, and a third generation submodule. The first generation submodule is used for obtaining a texture map which can embody the representation effect of the surface texture in the three-dimensional scene based on the texture data; the second generation submodule is used for obtaining a height field capable of reflecting the terrain height of the terrain in the three-dimensional scene based on the terrain data; the adjusting module may adjust the texture map and/or the height field, and the third generation submodule may be configured to generate the three-dimensional scene image based on the adjusted texture map and height field.
The first generation submodule may be specifically configured to: take the surface texture represented by the texture data as a brush texture and acquire the transparency of the surface texture; acquire a normal texture and/or an illumination texture of the surface texture; and, according to the transparency of the surface texture, fuse the brush texture with the normal texture and/or the illumination texture of the surface texture to obtain a texture map capable of showing the representation effect of the surface texture in the three-dimensional scene.
The second generation submodule may be specifically configured to: determine a base height field of at least part of a scene area in the two-dimensional scene image; and process the base height field according to the terrain of the at least part of the scene area represented by the terrain data, to obtain a height field capable of representing the terrain height in the three-dimensional scene.
The third generation submodule may be specifically configured to: add the adjusted texture map to the three-dimensional terrain represented by the height field to obtain a three-dimensional scene image.
As an example, the application interface display apparatus may further include a modification module configured to modify the three-dimensional scene picture based on the art resources and/or the user setting data, or the generation module may further generate the three-dimensional scene picture based on texture data, terrain data, art resources, and/or the user setting data.
The present disclosure also provides an application interface display device, including: the display module is used for displaying an application interface, and the application interface comprises a two-dimensional scene picture; and the generating module is used for responding to the condition that the running environment of the application program accords with the preset condition and generating a three-dimensional scene picture based on the two-dimensional scene picture.
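The preset-condition check might be sketched as follows; the patent does not specify which environment attributes are tested, so the GPU-memory and frame-rate thresholds below are purely illustrative assumptions.

```python
def should_generate_3d(env, preset):
    """Return True when the application's running environment meets the
    preset condition for generating a three-dimensional scene picture
    (the attribute names and thresholds are hypothetical)."""
    return (env.get("gpu_memory_mb", 0) >= preset["min_gpu_memory_mb"]
            and env.get("fps", 0) >= preset["min_fps"])

preset = {"min_gpu_memory_mb": 512, "min_fps": 30}
ok = should_generate_3d({"gpu_memory_mb": 1024, "fps": 60}, preset)  # capable device
no = should_generate_3d({"gpu_memory_mb": 256, "fps": 60}, preset)   # too little memory
```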
The generation module may include a separation module and a generation submodule. The separation module is used for obtaining texture data and terrain data based on the two-dimensional scene picture, the texture data is used for representing the earth surface texture in the two-dimensional scene picture, and the terrain data is used for representing the terrain in the two-dimensional scene picture. The generation submodule is used for generating a three-dimensional scene picture based on the texture data and the terrain data.
The generation submodule may include a first generation submodule, a second generation submodule, and a third generation submodule. The first generation submodule is used for obtaining a texture map which can embody the representation effect of the surface texture in the three-dimensional scene based on the texture data; the second generation submodule is used for obtaining a height field capable of reflecting the terrain height of the terrain in the three-dimensional scene based on the terrain data; and the third generation submodule is used for generating a three-dimensional scene picture based on the texture map and the height field.
The first generation submodule may be specifically configured to: take the surface texture represented by the texture data as a brush texture and acquire the transparency of the surface texture; acquire a normal texture and/or an illumination texture of the surface texture; and, according to the transparency of the surface texture, fuse the brush texture with the normal texture and/or the illumination texture of the surface texture to obtain a texture map capable of showing the representation effect of the surface texture in the three-dimensional scene.
The second generation submodule may be specifically configured to: determine a base height field of at least part of a scene area in the two-dimensional scene picture; and process the base height field according to the terrain of the at least part of the scene area represented by the terrain data, to obtain a height field capable of representing the terrain height in the three-dimensional scene.
The third generation submodule may be specifically configured to: add the texture map to the three-dimensional terrain represented by the height field to obtain a three-dimensional scene picture.
As an example, the application interface display apparatus may further include a modification module configured to modify the three-dimensional scene picture based on the art resources and/or the user setting data, or the generation module may further generate the three-dimensional scene picture based on texture data, terrain data, art resources, and/or the user setting data.
Fig. 7 is a schematic structural diagram of a computing device that can be used to implement the data processing method or the application interface display method or the game interface display method according to an embodiment of the present invention.
Referring to fig. 7, computing device 700 includes memory 710 and processor 720.
Processor 720 may be a multi-core processor or may include multiple processors. In some embodiments, processor 720 may include a general-purpose host processor and one or more special-purpose coprocessors, such as a graphics processing unit (GPU), a digital signal processor (DSP), or the like. In some embodiments, processor 720 may be implemented using custom circuits, such as an application-specific integrated circuit (ASIC) or a field-programmable gate array (FPGA).
The memory 710 may include various types of storage units, such as system memory, read-only memory (ROM), and a permanent storage device. The ROM may store static data or instructions required by processor 720 or other modules of the computer. The permanent storage device may be a readable and writable storage device, i.e., a non-volatile storage device that does not lose stored instructions and data even after the computer is powered off. In some embodiments, a mass storage device (e.g., a magnetic or optical disk, or flash memory) is employed as the permanent storage device. In other embodiments, the permanent storage device may be a removable storage device (e.g., a floppy disk or an optical drive). The system memory may be a readable and writable storage device or a volatile readable and writable storage device, such as dynamic random access memory. The system memory may store instructions and data that some or all of the processors require at runtime. In addition, the memory 710 may include any combination of computer-readable storage media, including various types of semiconductor memory chips (DRAM, SRAM, SDRAM, flash memory, programmable read-only memory) and magnetic and/or optical disks. In some embodiments, memory 710 may include a readable and/or writable removable storage device, such as a compact disc (CD), a read-only digital versatile disc (e.g., DVD-ROM, dual-layer DVD-ROM), a read-only Blu-ray disc, an ultra-density optical disc, a flash memory card (e.g., an SD card, a mini SD card, a Micro-SD card, etc.), a magnetic floppy disk, or the like. Computer-readable storage media do not contain carrier waves or transitory electronic signals transmitted by wireless or wired means.
The memory 710 has stored thereon executable code, which when processed by the processor 720, causes the processor 720 to perform the above-mentioned data processing method or application interface display method or game interface display method.
The data processing method, the application interface display method, the game interface display method, the apparatus and the device for performing the respective methods according to the present invention have been described in detail above with reference to the accompanying drawings.
Furthermore, the method according to the invention may also be implemented as a computer program or computer program product comprising computer program code instructions for carrying out the above-mentioned steps defined in the above-mentioned method of the invention.
Alternatively, the invention may also be embodied as a non-transitory machine-readable storage medium (or computer-readable storage medium, or machine-readable storage medium) having stored thereon executable code (or a computer program, or computer instruction code) which, when executed by a processor of an electronic device (or computing device, server, etc.), causes the processor to perform the steps of the above-described method according to the invention.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or combinations of both.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (34)

1. A method of data processing, comprising:
obtaining texture data and terrain data based on a two-dimensional scene picture, wherein the texture data is used for representing earth surface textures in the two-dimensional scene picture, and the terrain data is used for representing terrain in the two-dimensional scene picture;
and generating a three-dimensional scene picture based on the texture data and the terrain data.
2. The method of claim 1, wherein generating a three-dimensional scene picture based on the texture data and the terrain data comprises:
obtaining a texture map capable of showing the representation effect of the earth surface texture in the three-dimensional scene based on the texture data;
obtaining a height field capable of reflecting the terrain height of the terrain in a three-dimensional scene based on the terrain data;
and generating the three-dimensional scene picture based on the texture map and the height field.
3. The method of claim 2, wherein the step of deriving a texture map that embodies the representation of the surface texture in the three-dimensional scene based on the texture data comprises:
taking the surface texture represented by the texture data as a brush texture, and acquiring the transparency of the surface texture;
acquiring a normal texture and/or an illumination texture of the surface texture;
and according to the transparency of the surface texture, fusing the brush texture, the normal texture and/or the illumination texture of the surface texture to obtain a texture map capable of showing the representation effect of the surface texture in the three-dimensional scene.
4. The method of claim 2, wherein deriving a height field capable of characterizing a terrain height of the terrain in the three-dimensional scene based on the terrain data comprises:
determining a base height field of at least a portion of a scene region in the two-dimensional scene picture;
and processing the basic height field according to the terrain of at least part of the scene area represented by the terrain data to obtain a height field capable of representing the terrain height of the terrain in the three-dimensional scene.
5. The method of claim 2, wherein generating the three-dimensional scene picture based on the texture map and the height field comprises:
and adding the texture map into the three-dimensional terrain represented by the height field to obtain the three-dimensional scene picture.
6. The method of claim 2, wherein the method is performed by a client device,
generating, by a central processor in the client device, a plurality of first tasks for determining texture maps of different areas in a scene area based on the scene area represented by the two-dimensional scene picture, handing the plurality of first tasks to a graphics processor for execution, and/or
Generating, by a central processor in the client device, a plurality of second tasks for determining a height field of different ones of the scene regions based on the scene region characterized by the two-dimensional scene picture, the plurality of second tasks being handed to the graphics processor for execution.
7. The method of claim 1, wherein the texture data comprises one or more layers of texture layer data, each layer of texture layer data corresponding to a surface texture, the texture layer data being used to characterize the surface texture at different locations in a corresponding scene region of the two-dimensional scene picture.
8. The method of claim 7, wherein the surface texture comprises one or more of: sand, grass, snow, rock, soil, trees, buildings.
9. The method of claim 1, wherein the terrain data comprises one or more layers of terrain layer data, each layer of terrain layer data corresponding to a terrain, the terrain layer data being used to indicate the terrain at different locations in the scene region corresponding to the two-dimensional scene picture, and the terrain data further comprises logic data for indicating a hierarchical relationship between the terrains corresponding to different terrain layer data at the same location.
10. The method of claim 9, wherein the terrain layer data comprises one or more of: sand cover layer data, grass cover layer data, snow cover layer data, mountain cover layer data, river cover layer data, and road cover layer data.
11. The method of claim 1, further comprising:
obtaining art resources and/or user setting data;
the step of modifying the three-dimensional scene picture based on the art resources and/or the user setting data, or generating the three-dimensional scene picture based on the texture data and the terrain data includes: and generating a three-dimensional scene picture based on the texture data, the terrain data, the art resources and/or the user setting data.
12. An application interface display method, comprising:
displaying an application interface, wherein the application interface comprises a two-dimensional scene picture;
and responding to a switching request of a user, and displaying a three-dimensional scene picture generated based on the two-dimensional scene picture in the application interface.
13. The method of claim 12, wherein the three-dimensional scene picture is generated in real time in response to the switching request, or the three-dimensional scene picture is obtained in advance.
14. The method of claim 12, wherein,
the three-dimensional scene is generated using the data processing method of any one of claims 1 to 11.
15. An application interface display method, comprising:
displaying an application interface, wherein the application interface comprises a two-dimensional scene picture;
responding to a switching request of a user, and judging whether a three-dimensional scene picture generated based on the two-dimensional scene picture exists or not;
and if the three-dimensional scene picture exists, displaying the three-dimensional scene picture, and if the three-dimensional scene picture does not exist, generating the three-dimensional scene picture based on the two-dimensional scene picture and displaying the generated three-dimensional scene picture.
16. An auxiliary operation method, comprising:
receiving a two-dimensional scene image input by a user;
generating a three-dimensional scene image based on the two-dimensional scene image in response to a three-dimensional conversion request of a user for the two-dimensional scene image; and
and providing the three-dimensional scene image to a user.
17. The method of claim 16, wherein the three-dimensional scene image is generated using the data processing method of any of claims 1 to 11.
18. The method of claim 16, further comprising:
receiving adjustment parameters set by a user for a three-dimensional scene image, wherein the step of generating the three-dimensional scene image based on the two-dimensional scene image comprises the following steps: generating a three-dimensional scene image based on the two-dimensional scene image and the adjustment parameter such that a parameter of the generated three-dimensional scene image conforms to the adjustment parameter.
19. An application interface display method, comprising:
displaying an application interface, wherein the application interface comprises a two-dimensional scene picture;
responding to a switching request of a user, and generating a three-dimensional scene picture based on an adjusting parameter set by the user and the two-dimensional scene picture; and
and displaying the three-dimensional scene picture in the application interface.
20. An application interface display method, comprising:
displaying an application interface, wherein the application interface comprises a two-dimensional scene picture;
and responding to the condition that the running environment of the application program accords with the preset condition, and generating a three-dimensional scene picture based on the two-dimensional scene picture.
21. The method of claim 20, further comprising:
modifying the three-dimensional scene picture;
and displaying the modified three-dimensional scene picture in the application interface.
22. A game interface display method, comprising:
generating a three-dimensional parcel based on map resources of a two-dimensional parcel in response to a switching operation of a user for the two-dimensional parcel in a game interface; and
displaying a game interface including the three-dimensional parcel.
23. The method of claim 22, wherein generating a three-dimensional parcel based on map resources of the two-dimensional parcel comprises:
obtaining texture data and terrain data based on map resources of the two-dimensional land parcel, wherein the texture data is used for representing the surface texture in the two-dimensional land parcel, and the terrain data is used for representing the terrain in the two-dimensional land parcel;
generating the three-dimensional parcel based on the texture data and the terrain data.
24. The method of claim 23, wherein generating the three-dimensional parcel based on the texture data and the terrain data comprises:
obtaining a texture map capable of showing the representation effect of the earth surface texture in the three-dimensional scene based on the texture data;
obtaining a height field capable of reflecting the terrain height of the terrain in a three-dimensional scene based on the terrain data;
generating the three-dimensional parcel based on the texture map and the height field.
25. A data processing system, comprising: a central processing unit and a graphics processing unit,
wherein the central processing unit is used for generating a task for image processing involving pixel-level intensive operations and sending the task to the graphics processing unit, and the graphics processing unit is used for executing the task and sending a task execution result to the central processing unit.
26. A data processing apparatus comprising:
the device comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring texture data and topographic data based on a two-dimensional scene picture, the texture data is used for representing the surface texture in the two-dimensional scene picture, and the topographic data is used for representing the topography in the two-dimensional scene picture;
and the generating module is used for generating a three-dimensional scene picture based on the texture data and the terrain data.
27. An application interface display device comprising:
a display module for displaying an application interface, wherein the application interface comprises a two-dimensional scene picture,
the display module is further used for responding to a switching request of a user and displaying a three-dimensional scene picture generated based on the two-dimensional scene picture in the application interface.
28. An application interface display device comprising:
the display module is used for displaying an application interface, and the application interface comprises a two-dimensional scene picture;
the judging module is used for responding to a switching request of a user and judging whether a three-dimensional scene picture generated based on the two-dimensional scene picture exists or not;
the display module is further configured to display the three-dimensional scene picture or the three-dimensional scene picture generated by the generation module if the three-dimensional scene picture exists.
29. An auxiliary operating device comprising:
the receiving module is used for receiving a two-dimensional scene image input by a user;
a generating module, configured to generate a three-dimensional scene image based on the two-dimensional scene image in response to a three-dimensional conversion request for the two-dimensional scene image by a user; and
and the providing module is used for providing the three-dimensional scene image for a user.
30. An application interface display device comprising:
the display module is used for displaying an application interface, and the application interface comprises a two-dimensional scene picture;
the generating module is used for responding to a switching request of a user and generating a three-dimensional scene picture based on an adjusting parameter set by the user and the two-dimensional scene picture; and
the display module is further used for displaying the three-dimensional scene picture in the application interface.
31. An application interface display device, comprising:
a display module configured to display an application interface, the application interface including a two-dimensional scene picture; and
a generating module configured to generate a three-dimensional scene picture based on the two-dimensional scene picture in response to the running environment of the application meeting a preset condition.
32. A game interface display device, comprising:
a generating module configured to, in response to a user's switching operation on a two-dimensional parcel in a game interface, generate a three-dimensional parcel based on map resources of the two-dimensional parcel; and
a display module configured to display a game interface including the three-dimensional parcel.
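One common way to turn a 2D parcel's map resource into 3D geometry, sketched below, is to read the resource as a heightmap and lift each grid cell into a vertex. This heightmap interpretation, the function name, and the parameters are assumptions for illustration; the patent does not specify this algorithm.

```python
def parcel_mesh_from_tile(tile, cell_size=1.0, height_scale=5.0):
    """Build a simple 3D vertex grid from a 2D parcel's map resource.

    `tile` is a 2D list of normalized values (0..1) read from the
    parcel's texture/map resource. Interpreting those values as terrain
    heights is an assumed, illustrative choice.
    """
    vertices = []
    for row, cells in enumerate(tile):
        for col, value in enumerate(cells):
            # x and z come from the 2D grid position; y from the map value.
            vertices.append((col * cell_size, value * height_scale, row * cell_size))
    return vertices


tile = [[0.0, 0.2],
        [0.4, 1.0]]
mesh = parcel_mesh_from_tile(tile)
print(mesh[3])  # (1.0, 5.0, 1.0)
```

A real implementation would additionally triangulate the vertex grid and sample the parcel's texture for per-vertex UVs, but the core mapping from 2D resource to 3D geometry is the loop above.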
33. A computing device, comprising:
a processor; and
a memory having executable code stored thereon which, when executed by the processor, causes the processor to perform the method of any one of claims 1 to 24.
34. A non-transitory machine-readable storage medium having executable code stored thereon which, when executed by a processor of an electronic device, causes the processor to perform the method of any one of claims 1 to 24.
CN202011508274.3A 2020-12-14 2020-12-18 Data processing method, application interface display device and auxiliary operation method and device Pending CN113298927A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011469530 2020-12-14
CN2020114695302 2020-12-14

Publications (1)

Publication Number Publication Date
CN113298927A true CN113298927A (en) 2021-08-24

Family

ID=77318733

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011508274.3A Pending CN113298927A (en) 2020-12-14 2020-12-18 Data processing method, application interface display device and auxiliary operation method and device

Country Status (1)

Country Link
CN (1) CN113298927A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105740256A (en) * 2014-12-09 2016-07-06 AutoNavi Information Technology Co., Ltd. Generation method and generation device of three-dimensional map
CN105913485A (en) * 2016-04-06 2016-08-31 Beijing Xiaoxiaoniu Creative Technology Co., Ltd. Three-dimensional virtual scene generation method and device
CN106649466A (en) * 2016-09-27 2017-05-10 Xidian University Method for obtaining geometrical parameters of typical terrains in a digital map
CN107527038A (en) * 2017-08-31 2017-12-29 Fudan University Automatic three-dimensional ground-object extraction and scene reconstruction method
CN109741446A (en) * 2018-12-12 2019-05-10 Sichuan Huakong Graphic Technology Co., Ltd. Method for dynamically generating fine coastal terrain in a three-dimensional digital earth
CN112035694A (en) * 2020-08-18 2020-12-04 NR Electric Co., Ltd. Lightweight front-end GIS two-dimensional/three-dimensional integrated display method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
GUO CHENCHEN: "Large-scale three-dimensional terrain generation, rendering and application based on real geographic information", China Master's Theses Full-text Database, Information Science and Technology *
YAN FENGTING: "Research and improvement of large-scale three-dimensional terrain modeling and rendering", China Master's Theses Full-text Database, Information Science and Technology *

Similar Documents

Publication Publication Date Title
CN107358649B (en) Processing method and device of terrain file
CN103946895B (en) The method for embedding in presentation and equipment based on tiling block
US20230074265A1 (en) Virtual scenario generation method and apparatus, computer device and storage medium
US20130100132A1 (en) Image rendering device, image rendering method, and image rendering program for rendering stereoscopic images
US20120133639A1 (en) Strip panorama
CN111784833A (en) WebGL-based flood evolution situation three-dimensional dynamic visualization display method
US20060132488A1 (en) Apparatus and method for representing multi-level LOD three-dimensional image
CN102411791B (en) Method and equipment for changing static image into dynamic image
CA2834575A1 (en) Method of rendering a terrain stored in a massive database
CN113196333A (en) Integration of variable rate shading and supersample shading
CN101477700A (en) Real tri-dimension display method oriented to Google Earth and Sketch Up
KR20080018404A (en) Computer readable recording medium having background making program for making game
Homer et al. Partitioning the conterminous United States into mapping zones for Landsat TM land cover mapping
CN104700455A (en) Method for visualizing three-dimensional data
CN113298927A (en) Data processing method, application interface display device and auxiliary operation method and device
CN101521828B (en) Implanted type true three-dimensional rendering method oriented to ESRI three-dimensional GIS module
JP5320576B1 (en) Highland leveling program, dynamic link library and landscape examination device
Izham et al. Influence of georeference for saturated excess overland flow modelling using 3D volumetric soft geo-objects
CN101488229A (en) PCI three-dimensional analysis module oriented implantation type ture three-dimensional stereo rendering method
KR100715669B1 (en) Device and method for representation of multi-level lod 3-demension image
CN114367113A (en) Method, apparatus, medium, and computer program product for editing virtual scene
CN106815359B (en) Vector map three-dimensional edge generation method based on mobile GIS platform
CN101482978B (en) ENVI/IDL oriented implantation type true three-dimensional stereo rendering method
CN101488232B (en) Implanted true three-dimension volumetric display method oriented to C Tech software
Sin et al. Planetary marching cubes: A marching cubes algorithm for spherical space

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination