CN117357894A - Three-dimensional scene generation method, device, equipment and medium - Google Patents

Three-dimensional scene generation method, device, equipment and medium

Info

Publication number
CN117357894A
Authority
CN
China
Prior art keywords
scene
data
voxel
game
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311440394.8A
Other languages
Chinese (zh)
Other versions
CN117357894B (en)
Inventor
洪晓健
王楠楠
苟志远
高镜皓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Amazgame Age Internet Technology Co ltd
Original Assignee
Beijing Amazgame Age Internet Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Amazgame Age Internet Technology Co ltd filed Critical Beijing Amazgame Age Internet Technology Co ltd
Priority to CN202311440394.8A priority Critical patent/CN117357894B/en
Publication of CN117357894A publication Critical patent/CN117357894A/en
Application granted granted Critical
Publication of CN117357894B publication Critical patent/CN117357894B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/56 Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/08 Volume rendering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses a three-dimensional scene generation method, apparatus, device and medium, relating to the field of computer technologies. The method comprises: acquiring game scene data of a game scene; performing voxelization conversion on the game scene data to obtain voxel scene data; and, in response to target voxel data being added to the voxel scene data, generating a three-dimensional scene in which a target object corresponding to the target voxel data is placed in the game scene. Processing the game scene with a voxel method can therefore improve the realism and fineness of the game scene, and stuttering is less likely to occur during subsequent character movement. The problem that the realism of the game scene is poor when computer performance is limited or the game scene is large, which arises when the game scene is built with the PhysX physics engine, is also avoided.

Description

Three-dimensional scene generation method, device, equipment and medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a medium for generating a three-dimensional scene.
Background
With the development of computer hardware, online games of all kinds keep emerging. To improve the realism of scenes in online games and increase their appeal to players, more and more online games are evolving toward three dimensions. Home-building games, an important category of online game, no longer limit play to two-dimensional placement but extend it to three-dimensional construction.
In the related art, a PhysX physics engine is generally adopted to build home scenes. Building a home scene with the PhysX physics engine allows object motion in the home scene to conform to the physical laws of the real world.
However, the PhysX physics engine also places high demands on computer performance. When computer performance is limited or the game scene is large, the realism of the game scene is poor, and stuttering easily occurs during subsequent character movement.
Disclosure of Invention
The application provides a three-dimensional scene generation method, apparatus, device and medium, which can improve the realism of a game scene and make stuttering less likely to occur during subsequent character movement.
The application discloses the following technical scheme:
In a first aspect, the present application provides a three-dimensional scene generation method, including: acquiring game scene data of a game scene;
performing voxelization conversion on the game scene data to obtain voxel scene data;
and, in response to target voxel data being added to the voxel scene data, generating a three-dimensional scene in which a target object corresponding to the target voxel data is placed in the game scene.
Optionally, the game scene data includes height information of the game scene, and performing voxelization conversion on the game scene data to obtain voxel scene data includes:
classifying game scene data characterizing the same height information to obtain M groups of equal-height game scene data, wherein M is a positive integer;
performing voxelization conversion on the M groups of equal-height game scene data respectively to obtain M groups of equal-height voxel scene data;
and integrating the M groups of equal-height voxel scene data to obtain the voxel scene data.
Optionally, the generating, in response to target voxel data being added to the voxel scene data, a three-dimensional scene in which a target object corresponding to the target voxel data is placed in the game scene includes:
in response to target voxel data being added to the voxel scene data, generating a three-dimensional scene in which the target object corresponding to the target voxel data is placed in the game scene with the direction data in the target voxel data as its orientation.
Optionally, the generating a three-dimensional scene in which the target object corresponding to the target voxel data is placed in the game scene includes:
if the position data in the target voxel data overlaps the position data in the voxel scene data, generating the three-dimensional scene in which the target object corresponding to the target voxel data is placed in the game scene by reading only the position data in the voxel scene data.
Optionally, the method further comprises:
baking, in the three-dimensional scene, pathfinding navigation data according to the voxel scene data and the target voxel data.
Optionally, the baking, in the three-dimensional scene, pathfinding navigation data according to the voxel scene data and the target voxel data includes:
setting pathfinding jump points in the three-dimensional scene, wherein the pathfinding jump points include one or more of a diagonal-up jump point, a diagonal-down jump point, a vertical-up jump point, a vertical-down jump point and a horizontal jump point;
and baking the pathfinding navigation data according to the pathfinding jump points, the voxel scene data and the target voxel data.
Optionally, the method further comprises:
and if the three-dimensional scene comprises a jagged scene, modifying the jagged scene into a smooth scene by connecting the first end and the second end of the jagged scene.
In a second aspect, the present application provides a three-dimensional scene generating apparatus, the apparatus comprising: the system comprises a first acquisition module, a second acquisition module and a scene generation module;
the first acquisition module is used for acquiring game scene data of a game scene;
the second acquisition module is used for carrying out voxel conversion on the game scene data to obtain voxel scene data;
the scene generation module is used for responding to adding target voxel data into the voxel scene data and generating a three-dimensional scene for placing a target object corresponding to the target voxel data in the game scene.
Optionally, the second obtaining module includes: a classification sub-module, a conversion sub-module and an integration sub-module;
the classification sub-module is used for classifying game scene data characterizing the same height information to obtain M groups of equal-height game scene data, wherein M is a positive integer;
the conversion sub-module is used for performing voxelization conversion on the M groups of equal-height game scene data respectively to obtain M groups of equal-height voxel scene data;
and the integration sub-module is used for integrating the M groups of equal-height voxel scene data to obtain the voxel scene data.
Optionally, the scene generation module is specifically configured to: in response to target voxel data being added to the voxel scene data, generate a three-dimensional scene in which the target object corresponding to the target voxel data is placed in the game scene with the direction data in the target voxel data as its orientation.
Optionally, the scene generation module is specifically configured to: if the position data in the target voxel data overlaps the position data in the voxel scene data, generate the three-dimensional scene in which the target object corresponding to the target voxel data is placed in the game scene by reading only the position data in the voxel scene data.
Optionally, the apparatus further includes: a route baking module;
the route baking module is used for baking, in the three-dimensional scene, pathfinding navigation data according to the voxel scene data and the target voxel data.
Optionally, the route baking module includes: a setting sub-module and a baking sub-module;
the setting sub-module is used for setting pathfinding jump points in the three-dimensional scene, wherein the pathfinding jump points include one or more of a diagonal-up jump point, a diagonal-down jump point, a vertical-up jump point, a vertical-down jump point and a horizontal jump point;
and the baking sub-module is used for baking the pathfinding navigation data according to the pathfinding jump points, the voxel scene data and the target voxel data.
Optionally, the apparatus further includes: a scene modification module;
the scene modification module is used for, if the three-dimensional scene includes a jagged scene, modifying the jagged scene into a smooth scene by connecting a first end and a second end of the jagged scene.
In a third aspect, the present application provides a three-dimensional scene generating apparatus, including: a memory and a processor;
the memory is used for storing programs;
the processor is configured to implement the steps of the three-dimensional scene generating method when executing the computer program.
In a fourth aspect, the present application provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the three-dimensional scene generation method described above.
Compared with the prior art, the application has the following beneficial effects:
the application provides a three-dimensional scene generation method, a device, equipment and a medium, wherein the method comprises the following steps: acquiring game scene data of a game scene; voxel conversion is carried out on the game scene data to obtain voxel scene data; in response to adding the target voxel data to the voxel scene data, a three-dimensional scene is generated in which a target object corresponding to the target voxel data is placed in the game scene. Therefore, the reality degree and the fineness degree of the game scene can be improved by processing the game scene by a voxel method, and the phenomenon of clamping and stopping is not easy to occur in the subsequent role movement process. And the problem that the reality degree of the game scene is poor when the computer performance is poor or the game scene is large due to the fact that the game scene is built by using the physisx physical engine is avoided.
Drawings
To more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and a person skilled in the art may derive other drawings from them without inventive effort.
Fig. 1 is a flowchart of a three-dimensional scene generating method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of baking pathfinding navigation data according to an embodiment of the present application;
FIG. 3 is a schematic diagram of another baking of pathfinding navigation data according to an embodiment of the present application;
FIG. 4 is a schematic diagram of jumping down from an eave according to an embodiment of the present application;
FIG. 5 is a schematic diagram of climbing up a high wall according to an embodiment of the present application;
fig. 6 is a schematic diagram of jumping across a broken bridge according to an embodiment of the present application;
fig. 7 is a schematic diagram of resolving a jagged scene according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a ray determination according to an embodiment of the present application;
FIG. 9 is a schematic diagram of another ray determination provided in an embodiment of the present application;
fig. 10 is a schematic diagram of a three-dimensional scene generating device according to an embodiment of the present application;
FIG. 11 is a schematic diagram of a computer readable medium according to an embodiment of the present application;
fig. 12 is a schematic hardware structure of a server according to an embodiment of the present application.
Detailed Description
As described above, in the related art a PhysX physics engine is generally adopted to build home scenes. Building a home scene with the PhysX physics engine allows object motion in the home scene to conform to the physical laws of the real world.
However, the PhysX physics engine also places high demands on computer performance. When computer performance is limited or the game scene is large, the realism of the game scene is poor, and stuttering easily occurs during subsequent character movement.
In view of this, the present application discloses a three-dimensional scene generation method, apparatus, device and medium. The method includes: acquiring game scene data of a game scene; performing voxelization conversion on the game scene data to obtain voxel scene data; and, in response to target voxel data being added to the voxel scene data, generating a three-dimensional scene in which a target object corresponding to the target voxel data is placed in the game scene. Processing the game scene with a voxel method can therefore improve the realism and fineness of the game scene, and stuttering is less likely to occur during subsequent character movement. The problem that the realism of the game scene is poor when computer performance is limited or the game scene is large, which arises when the game scene is built with the PhysX physics engine, is also avoided.
In order to make the present application solution better understood by those skilled in the art, the following description will clearly and completely describe the technical solution in the embodiments of the present application with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
Referring to fig. 1, the figure is a flowchart of a three-dimensional scene generating method provided in an embodiment of the present application. The method comprises the following steps:
s101: game scene data of a game scene is acquired.
The game scene includes an environment, a building, props, etc. in the game. The game scene data may record relevant information in each part of the game scene, such as surrounding environment, building height, and use state of props.
S102: and carrying out voxelization conversion on the game scene data to obtain the voxel scene data.
After the game scene data is acquired, voxelization conversion needs to be performed on the game scene data to obtain voxel scene data. A voxel, short for volume element, is the smallest unit into which the game scene data is divided in three-dimensional space.
In some specific implementations, game scene data representing the same height information may be grouped according to the height information in the game scene data. After classification, static voxelization conversion is performed on the classified game scene data to obtain static voxel scene data. Voxelization here means that a solid height field is constructed from the source geometry to represent space where walking is not possible or where no target voxel data (i.e., target object) can be placed. The static voxel scene data is then integrated, and the integrated static voxel scene data is processed dynamically to obtain dynamic voxel scene data.
In the three-dimensional scene generation method disclosed in the embodiments of the present application, static voxel scene data is generated with the Recast open source library. Because voxel scene data generated directly by the Recast open source library contains a large number of pointers and consumes a large amount of memory, the heights of the upper and lower surfaces of all voxel blocks in the game scene data can instead be stored in a one-dimensional array of type short, spanArr, that is, the game scene data is organized according to its height information. The voxel labels of all voxel blocks are recorded in an unsigned char array, where a voxel label indicates whether the voxel is walkable, the voxel block type used to determine the pathfinding weight, and so on. For the voxel column at coordinates (x, y), the subscript of its first voxel in spanArr is recorded in indexArr, and the number of voxels in that column is recorded in countArr. With row as the horizontal length and col as the horizontal width, row × col gives the scene size. To obtain the voxel scene data at position (x, y) in the game scene data, the index id of the voxel column is first obtained through the formula x + y × row, the subscript in spanArr of the first voxel of that column is then obtained through indexArr[id], and the number of voxels at that position is obtained through countArr[id]. For dynamic processing of the static voxel data, the id obtained from x + y × row can be used as a key to build a hash table (std::unordered_map<int, std::vector<rcDynamicVoxelData>>), so that the stored dynamic array can be found through the id and entries can be added, deleted and queried directly, thereby obtaining dynamic voxel scene data.
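For illustration, the following is a minimal C++ sketch of the storage layout described above; the type and variable names (VoxelSpan, DynamicVoxel, VoxelScene, dynamicMap) are illustrative assumptions and are not identifiers taken from the application or from the Recast library.

```cpp
#include <unordered_map>
#include <vector>

// Illustrative sketch of the voxel storage described above; names are assumptions.
struct VoxelSpan {
    short bottom;         // height of the lower surface of the voxel block
    short top;            // height of the upper surface of the voxel block
    unsigned char area;   // voxel label: walkable flag / block type used for the pathfinding weight
};

struct DynamicVoxel { short bottom, top; unsigned char area; };

struct VoxelScene {
    int row = 0, col = 0;                // horizontal length and width; row * col is the scene size
    std::vector<VoxelSpan> spanArr;      // all static voxel blocks, stored column after column
    std::vector<int> indexArr;           // per column: index of its first span in spanArr
    std::vector<int> countArr;           // per column: number of spans in that column
    std::unordered_map<int, std::vector<DynamicVoxel>> dynamicMap;  // dynamic voxels keyed by column id

    int ColumnId(int x, int y) const { return x + y * row; }        // the id formula x + y * row

    // Read the static voxel data of the column at (x, y).
    void GetStaticColumn(int x, int y, const VoxelSpan*& first, int& count) const {
        const int id = ColumnId(x, y);
        first = &spanArr[indexArr[id]];
        count = countArr[id];
    }

    // Add or query dynamic voxel data for the column at (x, y) directly through the hash table.
    void AddDynamicVoxel(int x, int y, const DynamicVoxel& v) { dynamicMap[ColumnId(x, y)].push_back(v); }
    const std::vector<DynamicVoxel>* GetDynamicColumn(int x, int y) const {
        auto it = dynamicMap.find(ColumnId(x, y));
        return it == dynamicMap.end() ? nullptr : &it->second;
    }
};
```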
S103: in response to adding the target voxel data to the voxel scene data, a three-dimensional scene is generated in which a target object corresponding to the target voxel data is placed in the game scene.
After the voxel scene data is obtained, it can be changed dynamically by adding target voxel data. Specifically, the target voxel data may be added in the manner of building blocks: the target voxel data obtained by voxelizing a building block is added to the voxel scene data, so that the building block is placed in the game scene. A building block may be any object with volume, such as a building or a brick, and can serve both as an obstruction and as a platform to stand on.
It can be understood that the target voxel data corresponding to a building block is generated with the Recast open source library in the same way as the voxel scene data, and is stored in the building block's own spanArr, voxel label array, indexArr and countArr.
In some specific implementations, the target voxel data may include direction data. That is, the player can set the direction data in the target voxel data to define the orientation of the target object corresponding to the target voxel data, and thereby place the target object in the three-dimensional scene of the game scene with a fixed orientation.
Further, even after the target object has been placed in the three-dimensional scene of the game scene with a fixed orientation, its orientation can still be adjusted by changing the direction data. For example, by changing the direction data the target object may be rotated among four directions (0°, 90°, 180° and 270°) or rotated freely by any number of degrees.
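For illustration, the following is a minimal C++ sketch of rotating the footprint of the target voxel data according to the direction data, assuming quarter-turn values of 0 to 3 for 0°, 90°, 180° and 270°; the VoxelCell type and the rotation convention are illustrative assumptions and are not taken from the application.

```cpp
#include <vector>

// Illustrative cell of a target object's voxel footprint; field names are assumptions.
struct VoxelCell { int x, y; short bottom, top; };

// quarterTurns 0, 1, 2, 3 correspond to 0 deg, 90 deg, 180 deg and 270 deg around the local origin.
std::vector<VoxelCell> RotateFootprint(const std::vector<VoxelCell>& cells, int quarterTurns) {
    std::vector<VoxelCell> rotated;
    rotated.reserve(cells.size());
    for (const VoxelCell& c : cells) {
        VoxelCell r = c;
        switch (quarterTurns & 3) {
            case 0: break;                          // 0 deg: unchanged
            case 1: r.x = -c.y; r.y =  c.x; break;  // 90 deg
            case 2: r.x = -c.x; r.y = -c.y; break;  // 180 deg
            case 3: r.x =  c.y; r.y = -c.x; break;  // 270 deg
        }
        rotated.push_back(r);
    }
    return rotated;
}
```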
In some specific implementations, the target voxel data may also include position data. That is, the player can specify the exact position in the game scene of the target object corresponding to the target voxel data by setting the position data in the target voxel data.
It should be noted that the position data of the target voxel data may overlap with position data in the voxel scene data, and forcibly placing the target object at the predetermined position in the game scene would then cause a clipping (model-penetration) phenomenon. Therefore, if the position data in the target voxel data overlaps the position data in the voxel scene data, a three-dimensional scene in which the target object corresponding to the target voxel data is placed in the game scene is generated by reading only the position data in the voxel scene data. It can be understood that the three-dimensional scene may alternatively be generated by reading only the position data of the target voxel data, which is not limited in this application.
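For illustration, the following C++ sketch shows the overlap rule described above, building on the VoxelScene and VoxelCell types from the earlier sketches; the overlap test and the area value used for dynamic voxels are illustrative assumptions, not the application's actual logic.

```cpp
#include <vector>

// Reuses the illustrative VoxelScene, VoxelSpan, DynamicVoxel and VoxelCell types from the sketches above.

// Returns true when the target cell's height range intersects an existing span of the scene column.
bool Overlaps(const VoxelScene& scene, int x, int y, short bottom, short top) {
    const VoxelSpan* first = nullptr;
    int count = 0;
    scene.GetStaticColumn(x, y, first, count);
    for (int i = 0; i < count; ++i) {
        if (bottom < first[i].top && top > first[i].bottom) return true;
    }
    return false;
}

// Place the target object cell by cell; overlapping cells keep the existing scene data only,
// which is what prevents the clipping a forced placement would cause.
void PlaceTarget(VoxelScene& scene, const std::vector<VoxelCell>& cells, int px, int py) {
    for (const VoxelCell& c : cells) {
        const int x = px + c.x, y = py + c.y;
        if (Overlaps(scene, x, y, c.bottom, c.top)) continue;           // read only the voxel scene data here
        scene.AddDynamicVoxel(x, y, DynamicVoxel{c.bottom, c.top, 1});  // area value 1 is illustrative
    }
}
```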
In some specific implementations, after a three-dimensional scene is generated, the three-dimensional scene needs to be saved and assigned a unique version number. Each time the three-dimensional scene is subsequently adjusted, for example when further target voxel data is added to the voxel scene data, another three-dimensional scene is generated, saved, and given a version number larger than the previous one. If the player's client crashes during actual operation, the server can query the largest version number after the player reconnects and display the three-dimensional scene corresponding to that largest version number, so that the player can continue playing.
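For illustration, the following C++ sketch shows the versioned saving described above; SceneStore and its in-memory std::map stand in for whatever persistent storage the server actually uses and are assumptions made for the example.

```cpp
#include <map>
#include <string>

class SceneStore {
public:
    // Save a snapshot of the three-dimensional scene and give it a version number larger than all previous ones.
    int SaveScene(const std::string& serializedScene) {
        const int version = scenes_.empty() ? 1 : scenes_.rbegin()->first + 1;
        scenes_[version] = serializedScene;
        return version;
    }

    // After the player reconnects (for example after a client crash), return the scene
    // with the largest version number; assumes at least one scene has been saved.
    const std::string& LoadLatest() const { return scenes_.rbegin()->second; }

private:
    std::map<int, std::string> scenes_;  // ordered by version number
};
```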
S104: in the three-dimensional scene, pathfinding navigation data is baked according to the voxel scene data and the target voxel data.
After the three-dimensional scene is generated, a dtMeshTile needs to be generated in the three-dimensional scene at fixed intervals, so that the pathfinding navigation data can be baked quickly and accurately according to the voxel scene data and the target voxel data.
Referring to fig. 2, which is a schematic diagram of baking pathfinding navigation data according to an embodiment of the present application. As shown in the figure, there is one dtMeshTile for each fixed-size region of the three-dimensional scene, and each dtMeshTile may contain a different number of triangular faces depending on the complexity of the three-dimensional scene. Within these triangular regions, the pathfinding navigation data can be baked.
It should be noted that even after the three-dimensional scene has been generated, further target voxel data may again be added to the voxel scene data corresponding to the three-dimensional scene to generate a new three-dimensional scene, and new pathfinding navigation data can then be baked again based on the new three-dimensional scene.
Referring to fig. 3, which is a schematic diagram of another baking of pathfinding navigation data according to an embodiment of the present application. Part a of fig. 3 shows the old triangular regions, in which the old pathfinding navigation data can be baked; part b of fig. 3 shows the new triangular regions after a new target object (the cube in the figure) has been added, in which the new pathfinding navigation data can be baked. Since the area occupied by the target object is not a walkable triangular region, the new pathfinding navigation data must route around the target object.
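For illustration, the following C++ sketch shows the incremental re-baking implied above: when new target voxel data is added, only the fixed-size regions (tiles) that the new object's bounds overlap need to be re-baked, and the navigation data of those tiles then routes around the occupied cells. The tile size, TileCoord and RebakeTile names are illustrative assumptions; the actual Recast/Detour build pipeline is more involved than shown here.

```cpp
#include <vector>

struct TileCoord { int tx, ty; };

// Collect every tile whose fixed-size region is overlapped by the new object's bounds (in voxel units).
std::vector<TileCoord> DirtyTiles(int minX, int minY, int maxX, int maxY, int tileSize) {
    std::vector<TileCoord> dirty;
    for (int ty = minY / tileSize; ty <= maxY / tileSize; ++ty)
        for (int tx = minX / tileSize; tx <= maxX / tileSize; ++tx)
            dirty.push_back(TileCoord{tx, ty});
    return dirty;
}

// Hypothetical per-tile step: re-triangulate the region from the voxel scene data plus the
// target voxel data, so that the walkable surface avoids the newly placed object.
void RebakeTile(const TileCoord& /*tile*/) {}

void OnTargetAdded(int minX, int minY, int maxX, int maxY, int tileSize) {
    for (const TileCoord& t : DirtyTiles(minX, minY, maxX, maxY, tileSize))
        RebakeTile(t);
}
```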
In some specific implementations, the baked pathfinding navigation data may also be associated with pathfinding jump points, where a pair of points jumped between from a start point to an end point is called a jump point. Illustratively, the pathfinding jump points include one or more of a diagonal-up jump point, a diagonal-down jump point, a vertical-up jump point, a vertical-down jump point and a horizontal jump point. For example, if the player wants the character to jump onto a building, a diagonal-up jump point needs to be set; if the player wants the character to jump across a broken bridge, a horizontal jump point needs to be set.
Jump points are further classified as one-way jump points or two-way jump points. If a jump point is a two-way jump point, it can also be used for transfer. For example, if the user wants to transfer the character directly from point 1 at height a to point 2 at height a+1, the voxel scene data obtained in step S102 can be used to determine the height relationship between point 1 and point 2 (i.e., whether it is upward, downward or level), and jump points corresponding to that height relationship can then be set at point 1 and point 2 respectively to transfer the character.
Referring to fig. 4, which is a schematic diagram of jumping down from an eave according to an embodiment of the present application. From the voxel scene data obtained in step S102 it can be determined that there is a height difference between the eave and the ground. The corresponding position on the eave can then be taken as the start point of a pathfinding jump point and the corresponding position on the ground as its end point, forming a pair of one-way diagonal-down jump points for jumping down from the eave.
It should be noted that jump points cannot be set without restriction for every piece of data; if the jump points are too dense, the performance requirements of the game increase. Therefore, within the area around a jump point whose radius equals the character radius, no other jump point in the same direction is allowed to exist. This sparse processing reduces the density of jump points and the memory consumed by storing them, without affecting the actual pathfinding effect.
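For illustration, the following C++ sketch shows the sparse processing described above: a candidate jump point is dropped when another jump point of the same direction type already lies within one character radius. The JumpPoint type and its fields are illustrative assumptions.

```cpp
#include <cmath>
#include <vector>

enum class JumpDir { DiagonalUp, DiagonalDown, VerticalUp, VerticalDown, Horizontal };

struct JumpPoint { float x, y, z; JumpDir dir; bool twoWay; };

// Keep the candidate only if no already-kept jump point of the same direction is within the character radius.
bool ShouldKeep(const std::vector<JumpPoint>& kept, const JumpPoint& cand, float characterRadius) {
    for (const JumpPoint& p : kept) {
        if (p.dir != cand.dir) continue;  // only jump points in the same direction compete
        const float dx = p.x - cand.x, dy = p.y - cand.y, dz = p.z - cand.z;
        if (std::sqrt(dx * dx + dy * dy + dz * dz) < characterRadius) return false;
    }
    return true;
}

// Sparse processing: thin out the jump points so storage stays small without hurting pathfinding.
void Sparsify(std::vector<JumpPoint>& points, float characterRadius) {
    std::vector<JumpPoint> kept;
    for (const JumpPoint& p : points)
        if (ShouldKeep(kept, p, characterRadius)) kept.push_back(p);
    points.swap(kept);
}
```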
Referring to fig. 5, which is a schematic diagram of climbing up a high wall according to an embodiment of the present application. A high wall differs from an eave in that it is generally vertical, so the jump points should be set as vertical-up and vertical-down jump points.
Referring to fig. 6, which is a schematic diagram of jumping across a broken bridge according to an embodiment of the present application. From the voxel scene data obtained in step S102 it can be determined that there is no height difference between the two ends of the broken bridge, so the corresponding position at the left end of the broken bridge can be taken as the start point of a pathfinding jump point and the corresponding position at the right end as its end point, forming a pair of two-way horizontal jump points.
S105: if the three-dimensional scene comprises a jagged scene, the jagged scene is modified into a smooth scene by connecting the first end and the second end of the jagged scene.
After the three-dimensional scene is generated, scenes that should be smooth, such as walls and rivers, may show jagged edges. The jagged scene can then be modified into a smooth scene. The present application does not limit the specific object to be smoothed; only the smoothing method is described here.
Referring to fig. 7, which is a schematic diagram of resolving a jagged scene according to an embodiment of the present application. As shown in the figure, a slope may be filled into the corners of the jagged scene to reduce abrupt changes, so that the jagged edges become smooth slopes and the game picture no longer jitters. Alternatively, the first end and the second end of the jagged scene may be connected directly, thereby modifying the jagged scene into a smooth scene.
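For illustration, the following C++ sketch shows one way to smooth a jagged run by connecting its first and second ends, assuming the jagged scene is represented as the heights of consecutive voxel columns along the wall or river bank; this representation is an assumption made for the example.

```cpp
#include <cstddef>
#include <vector>

// Replace the stair-steps between the two ends of a jagged run with a straight slope.
void SmoothJaggedRun(std::vector<float>& heights) {
    if (heights.size() < 3) return;
    const float first = heights.front();
    const float last = heights.back();
    const float n = static_cast<float>(heights.size() - 1);
    for (std::size_t i = 1; i + 1 < heights.size(); ++i) {
        heights[i] = first + (last - first) * (static_cast<float>(i) / n);  // linear interpolation
    }
}
```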
In some specific implementations, a jitter problem may also occur when the character walks uphill and downhill during actual operation. The root cause is that the game scene data becomes discrete and loses precision in voxel form, so the real height in the game scene data needs to be restored. Specifically, after recording the offset or offset ratio between the height information of a given piece of voxel scene data and the height information of the corresponding game scene data, that offset or offset ratio can be applied to all voxel scene data as an additional reference value.
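For illustration, the following C++ sketch shows the additional reference value described above; voxelSize and the field names are illustrative assumptions.

```cpp
// Restores the real height from a quantized voxel height by applying a recorded offset.
struct HeightRestore {
    float voxelSize = 1.0f;  // world-space height represented by one voxel unit
    float offset = 0.0f;     // real height minus quantized height, recorded once

    // Record the offset between a sample of game scene data and its voxelized height.
    void Calibrate(float realHeight, short voxelHeight) {
        offset = realHeight - static_cast<float>(voxelHeight) * voxelSize;
    }

    // Apply the offset so characters no longer jitter when walking uphill or downhill.
    float RealHeight(short voxelHeight) const {
        return static_cast<float>(voxelHeight) * voxelSize + offset;
    }
};
```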
S106: the character moves in the three-dimensional scene along a movement route generated from the pathfinding navigation data.
It should be noted that while the character moves, ray determination is also required to detect whether the character collides with an object in the three-dimensional scene.
In some specific implementations, multiple nodes may be constructed from the voxel scene data, and for each node it is determined whether any voxel exists within its surrounding area. If no voxel exists in the surrounding area, the ray determination for that area can be skipped directly, which reduces the number of ray determinations and improves their efficiency.
Referring to fig. 8, which is a schematic diagram of a ray determination according to an embodiment of the present application. As shown in the figure, if there are no voxels in regions 1 and 2 and the ray does not pass through regions 3 and 4, the ray determination for regions 3 and 4 can be skipped directly.
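For illustration, the following C++ sketch shows the node-based early-out described above: each node records whether any voxel lies inside it, and a ray only tests the voxels of non-empty nodes it actually passes through. The Region type and the 2D slab intersection test are illustrative simplifications.

```cpp
#include <algorithm>
#include <cmath>
#include <utility>
#include <vector>

struct Region {
    float minX, minY, maxX, maxY;  // horizontal bounds of the node
    bool hasVoxels;                // precomputed from the voxel scene data
};

// Standard 2D slab test: does the ray (origin o, direction d) intersect the region's bounds?
bool RayIntersectsRegion(float ox, float oy, float dx, float dy, const Region& r) {
    float tmin = 0.0f, tmax = 1e30f;
    const float o[2] = {ox, oy}, d[2] = {dx, dy}, lo[2] = {r.minX, r.minY}, hi[2] = {r.maxX, r.maxY};
    for (int i = 0; i < 2; ++i) {
        if (std::fabs(d[i]) < 1e-6f) {
            if (o[i] < lo[i] || o[i] > hi[i]) return false;
        } else {
            float t1 = (lo[i] - o[i]) / d[i], t2 = (hi[i] - o[i]) / d[i];
            if (t1 > t2) std::swap(t1, t2);
            tmin = std::max(tmin, t1);
            tmax = std::min(tmax, t2);
            if (tmin > tmax) return false;
        }
    }
    return true;
}

bool RayHitsAnyVoxel(const std::vector<Region>& regions, float ox, float oy, float dx, float dy) {
    for (const Region& r : regions) {
        if (!r.hasVoxels) continue;                              // empty node: skip its ray determination
        if (!RayIntersectsRegion(ox, oy, dx, dy, r)) continue;   // ray never enters this node
        // ... test the individual voxels stored in this node ...
    }
    return false;  // sketch only: the per-voxel test is omitted
}
```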
In other specific implementations, horizontal rays may also be emitted. If no voxel exists at a given level, the ray determination for that level can be skipped directly, which reduces the number of ray determinations and improves their efficiency.
Referring to fig. 9, which is a schematic diagram of another ray determination according to an embodiment of the present application. If a horizontal ray is emitted at the level [n, n+1] and no voxels exist in that height range, the ray determination for that height range can be skipped directly. If a horizontal ray is emitted at the level [n-1, n] and voxels do exist in that height range, it can then be further determined whether voxels exist in the [0, 2] range and in the [3, 5] range, which again reduces the number of ray determinations and improves their efficiency.
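For illustration, the following C++ sketch shows the height-range early-out described above, assuming a precomputed per-level occupancy table; the table and function names are illustrative assumptions.

```cpp
#include <vector>

// levelHasVoxels[n] is true when at least one voxel exists in the height range [n, n + 1].
bool HorizontalRayNeedsTest(const std::vector<bool>& levelHasVoxels, int level) {
    if (level < 0 || level >= static_cast<int>(levelHasVoxels.size())) return false;
    return levelHasVoxels[level];  // false: skip the ray determination for this height range
}

// Usage sketch: only levels that contain voxels are tested further, and within such a level the
// test can again be narrowed to the horizontal sub-ranges that contain voxels, e.g. [0, 2] and [3, 5].
void TestHorizontalRay(const std::vector<bool>& levelHasVoxels, int level) {
    if (!HorizontalRayNeedsTest(levelHasVoxels, level)) return;
    // ... perform the ray determination only against the occupied sub-ranges of this level ...
}
```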
In summary, the present application discloses a three-dimensional scene generation method, including: acquiring game scene data of a game scene; performing voxelization conversion on the game scene data to obtain voxel scene data; and, in response to target voxel data being added to the voxel scene data, generating a three-dimensional scene in which a target object corresponding to the target voxel data is placed in the game scene. Processing the game scene with a voxel method can therefore improve the realism and fineness of the game scene, and stuttering is less likely to occur during subsequent character movement. The problem that the realism of the game scene is poor when computer performance is limited or the game scene is large, which arises when the game scene is built with the PhysX physics engine, is also avoided.
Referring to fig. 10, a schematic diagram of a three-dimensional scene generating device according to an embodiment of the present application is disclosed. The three-dimensional scene generating apparatus 200 includes: a first acquisition module 201, a second acquisition module 202, and a scene generation module 203;
specifically, the first obtaining module 201 is configured to obtain game scene data of a game scene; a second obtaining module 202, configured to voxel convert the game scene data to obtain voxel scene data; the scene generation module 203 is configured to generate a three-dimensional scene in which a target object corresponding to the target voxel data is placed in the game scene in response to adding the target voxel data to the voxel scene data.
In some specific implementations, the second acquisition module 202 includes: a classification sub-module, a conversion sub-module and an integration sub-module;
Specifically, the classification sub-module is used for classifying game scene data characterizing the same height information to obtain M groups of equal-height game scene data, wherein M is a positive integer; the conversion sub-module is used for performing voxelization conversion on the M groups of equal-height game scene data respectively to obtain M groups of equal-height voxel scene data; and the integration sub-module is used for integrating the M groups of equal-height voxel scene data to obtain the voxel scene data.
In some specific implementations, the scene generation module 203 is specifically configured to: in response to adding the target voxel data to the voxel scene data, a three-dimensional scene is generated in which a target object corresponding to the target voxel data is placed in the game scene with the direction data in the target voxel data as an orientation.
In some specific implementations, the scene generation module 203 is specifically configured to: if the position data in the target voxel data coincides with the position data in the voxel scene data, a three-dimensional scene in which a target object corresponding to the target voxel data is placed in the game scene is generated by reading only the position data in the voxel scene data.
In some specific implementations, the three-dimensional scene generating device 200 further includes: a route baking module. Specifically, the route baking module is used for baking, in the three-dimensional scene, pathfinding navigation data according to the voxel scene data and the target voxel data.
In some specific implementations, the route baking module includes: a setting sub-module and a baking sub-module;
Specifically, the setting sub-module is used for setting pathfinding jump points in the three-dimensional scene, wherein the pathfinding jump points include one or more of a diagonal-up jump point, a diagonal-down jump point, a vertical-up jump point, a vertical-down jump point and a horizontal jump point; and the baking sub-module is used for baking the pathfinding navigation data according to the pathfinding jump points, the voxel scene data and the target voxel data.
In some specific implementations, the three-dimensional scene generating device 200 further includes: a scene modification module. Specifically, if the three-dimensional scene includes a jagged scene, the scene modification module is used for modifying the jagged scene into a smooth scene by connecting the first end and the second end of the jagged scene.
In summary, the application discloses a three-dimensional scene generation apparatus, which processes the game scene with a voxel method, can improve the realism and fineness of the game scene, and makes stuttering less likely to occur during subsequent character movement. The problem that the realism of the game scene is poor when computer performance is limited or the game scene is large, which arises when the game scene is built with the PhysX physics engine, is also avoided.
Referring to fig. 11, a schematic diagram of a computer readable medium according to an embodiment of the present application is provided. The computer readable medium 300 has stored thereon a computer program 311, which computer program 311, when executed by a processor, implements the steps of the three-dimensional scene generation method of fig. 1 described above.
It should be noted that in the context of this application, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It should be noted that the machine-readable medium described in the present application may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, a computer-readable signal medium may include a data signal that propagates in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
Referring to fig. 12, which is a schematic diagram of the hardware structure of a server according to an embodiment of the present application, the server 400 may vary considerably in configuration and performance, and may include one or more central processing units (CPU) 422 (e.g., one or more processors), a memory 432, and one or more storage media 430 (e.g., one or more mass storage devices) storing application programs 440 or data 444. The memory 432 and the storage medium 430 may be transitory or persistent storage. The program stored on the storage medium 430 may include one or more modules (not shown), and each module may include a series of instruction operations on the server. Furthermore, the central processing unit 422 may be configured to communicate with the storage medium 430 and execute on the server 400 the series of instruction operations stored in the storage medium 430.
The server 400 may also include one or more power supplies 426, one or more wired or wireless network interfaces 450, one or more input/output interfaces 458, and/or one or more operating systems 441, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
The steps performed by the three-dimensional scene generation method in the above-described embodiment may be based on the server structure shown in fig. 12.
It should also be noted that, according to an embodiment of the present application, the process of the three-dimensional scene generating method described in the flowchart of fig. 1 may be implemented as a computer software program. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow diagram of fig. 1 described above.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.
While several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present application. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
The foregoing description is only of the preferred embodiments of the present application and is an explanation of the technical principles applied. Persons skilled in the art will appreciate that the scope of the disclosure referred to in this application is not limited to the specific combinations of the features described above, and is also intended to cover other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example technical solutions formed by replacing the above features with features having similar functions disclosed in (but not limited to) this application.

Claims (10)

1. A method of generating a three-dimensional scene, the method comprising:
acquiring game scene data of a game scene;
voxel conversion is carried out on the game scene data to obtain voxel scene data;
in response to adding target voxel data to the voxel scene data, a three-dimensional scene is generated in which a target object corresponding to the target voxel data is placed in the game scene.
2. The method of claim 1, wherein the game scene data includes height information of the game scene, and wherein the voxel conversion of the game scene data to obtain voxel scene data comprises:
classifying game scene data characterizing the same height information to obtain M groups of equal-height game scene data, wherein M is a positive integer;
performing voxel conversion on the M groups of equal-height game scene data respectively to obtain M groups of equal-height voxel scene data;
and integrating the M groups of equal-height voxel scene data to obtain the voxel scene data.
3. The method of claim 1, wherein the generating a three-dimensional scene in which a target object corresponding to target voxel data is placed in the game scene in response to adding the target voxel data to the voxel scene data comprises:
and in response to adding target voxel data into the voxel scene data, generating a three-dimensional scene in which a target object corresponding to the target voxel data is placed in the game scene with the direction data in the target voxel data as an orientation.
4. The method of claim 1, wherein the generating a three-dimensional scene that places a target object corresponding to the target voxel data in the game scene comprises:
and if the position data in the target voxel data is overlapped with the position data in the voxel scene data, generating a three-dimensional scene in which the target object corresponding to the target voxel data is placed in the game scene by only reading the position data in the voxel scene data.
5. The method according to claim 1, wherein the method further comprises:
baking, in the three-dimensional scene, pathfinding navigation data according to the voxel scene data and the target voxel data.
6. The method of claim 5, wherein the baking, in the three-dimensional scene, pathfinding navigation data according to the voxel scene data and the target voxel data comprises:
setting pathfinding jump points in the three-dimensional scene, wherein the pathfinding jump points include one or more of a diagonal-up jump point, a diagonal-down jump point, a vertical-up jump point, a vertical-down jump point and a horizontal jump point;
and baking the pathfinding navigation data according to the pathfinding jump points, the voxel scene data and the target voxel data.
7. The method according to claim 1, wherein the method further comprises:
and if the three-dimensional scene comprises a jagged scene, modifying the jagged scene into a smooth scene by connecting the first end and the second end of the jagged scene.
8. A three-dimensional scene generation apparatus, the apparatus comprising: the system comprises a first acquisition module, a second acquisition module and a scene generation module;
the first acquisition module is used for acquiring game scene data of a game scene;
the second acquisition module is used for carrying out voxel conversion on the game scene data to obtain voxel scene data;
the scene generation module is used for responding to adding target voxel data into the voxel scene data and generating a three-dimensional scene for placing a target object corresponding to the target voxel data in the game scene.
9. A three-dimensional scene generating apparatus, characterized by comprising: a memory and a processor;
the memory is used for storing programs;
the processor being adapted to execute the program to carry out the steps of the method according to any one of claims 1 to 7.
10. A computer storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method according to any of claims 1 to 7.
CN202311440394.8A 2023-11-01 2023-11-01 Three-dimensional scene generation method, device, equipment and medium Active CN117357894B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311440394.8A CN117357894B (en) 2023-11-01 2023-11-01 Three-dimensional scene generation method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311440394.8A CN117357894B (en) 2023-11-01 2023-11-01 Three-dimensional scene generation method, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN117357894A true CN117357894A (en) 2024-01-09
CN117357894B CN117357894B (en) 2024-03-29

Family

ID=89396391

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311440394.8A Active CN117357894B (en) 2023-11-01 2023-11-01 Three-dimensional scene generation method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN117357894B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109701273A (en) * 2019-01-16 2019-05-03 腾讯科技(北京)有限公司 Processing method, device, electronic equipment and the readable storage medium storing program for executing of game data
WO2022193612A1 (en) * 2021-03-16 2022-09-22 天津亚克互动科技有限公司 Motion processing method and apparatus for game character, and storage medium and computer device
CN112973127A (en) * 2021-03-17 2021-06-18 北京畅游创想软件技术有限公司 Game 3D scene editing method and device
CN115294256A (en) * 2022-08-16 2022-11-04 北京畅游创想软件技术有限公司 Hair processing method, device, electronic equipment and computer readable storage medium

Also Published As

Publication number Publication date
CN117357894B (en) 2024-03-29

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant