WO2021237785A1 - Large map baking and cutting method and recovery method - Google Patents

Large map baking and cutting method and recovery method

Info

Publication number
WO2021237785A1
Authority
WO
WIPO (PCT)
Prior art keywords
fragmented
plot
cutting
file
baking
Prior art date
Application number
PCT/CN2020/094672
Other languages
English (en)
Chinese (zh)
Inventor
郭耀琦
Original Assignee
广州尊游软件科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广州尊游软件科技有限公司
Publication of WO2021237785A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5372 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/65 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • A63F13/655 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition by importing photos, e.g. of the player

Definitions

  • The present invention relates to the field of information technology, and more specifically to a large map baking and cutting method and a restoration method.
  • In existing games, and in some special fields such as architectural modeling, a very large world map scene needs to be used to simulate the real or virtual scene to be displayed.
  • Such a world map scene generally has the following characteristics: a large surface area, with various resources such as rivers, trees, cliffs, buildings, vehicles and so on.
  • The most basic solution is to use the terrain editor provided by the game engine to create a complete surface, and to load the complete terrain when the game is played.
  • However, the surface textures then occupy a large amount of memory, which easily leads to excessive memory usage and crashes.
  • In another existing solution, the map is divided into multiple levels of detail according to the distance between the map and the virtual camera.
  • Each level of detail is baked and rendered in blocks, and the baking information corresponding to each level of detail is saved at a specified accuracy.
  • Different baking information files are exported to improve rendering performance as the camera moves from near to far; however, a map of the same size is still used, only with different baking information for rendering, so the problem of large memory usage is not solved.
  • The present invention provides a large map baking and cutting method and a restoration method.
  • A large map baking and cutting method includes the following steps:
  • step S1: obtain a source ground surface and determine cutting parameters; step S2: cut the source ground surface into a grid of several fragmented plots and save them as fragmented plot files; step S3: for each of the fragmented plots, bake the color of the fragmented plot to a texture, save the texture as a texture file, create a material corresponding to the texture, and assign a shader file according to the cutting parameters of step S1; step S4: output a cutting parameter file including the parameters of all fragmented plots, the paths of the fragmented plot files and texture files, and the paths of the material and shader files.
  • The principle of the above method is as follows: the source ground surface is cut into fragmented plots according to the cutting parameters, each plot is converted into a grid and saved as a fragmented plot file; the color of each fragmented plot is then baked onto a two-dimensional texture and saved as a texture file, a material corresponding to the texture is created for the fragmented plot, and an appropriate shader file is assigned manually or automatically according to the cutting parameters; finally, the parameters of the fragmented plots, the paths of the individual files, and the exported materials are output into a final cutting parameter file, which is used for recovery.
  • In step S4, the following steps are included:
  • Step S1 is specifically as follows:
  • The terrain editor cuts out several fragmented plots according to the set number of plots and the plot size, and outputs them for baking. Based on the baking origin, the fragmented plots are individually light-baked and combined into an overall preview of the plots. If the operator judges from the overall preview that the connected fragmented plots do not match the source surface, the method returns to setting the number of plots and the plot size and outputs a new overall preview; otherwise, it proceeds to step S2.
  • The source surface includes fixed objects.
  • When the terrain editor cuts out several fragmented plots according to the number of plots and the plot size, it first determines the fragmented plots to be cut according to the cutting parameters, and then, according to the position of each fixed object, determines whether that fixed object would be cut across more than one fragmented plot. If so, it identifies the first fragmented plot, i.e. the plot occupying the largest area of the fixed object among those plots, includes the fixed object in the first fragmented plot, and removes the fixed object from the other fragmented plots. This judgment is repeated until every fixed object is included in only a single fragmented plot, and the fragmented plots are output.
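  • The following is a minimal sketch of this fixed-object assignment rule, assuming axis-aligned plot cells and object footprints; the class and method names are illustrative, not part of the patent, and Unity C# types are used because the embodiment targets Unity3D.

```csharp
// Illustrative sketch: pick the plot cell that overlaps a fixed object's footprint the most.
using UnityEngine;

public static class FixedObjectAssigner
{
    // Returns the index of the plot whose cell overlaps the object's footprint the most
    // (the "first fragmented plot" in the text), or -1 if there is no overlap at all.
    public static int PlotIndexWithLargestOverlap(Rect objectFootprint, Rect[] plotCells)
    {
        int best = -1;
        float bestArea = 0f;
        for (int i = 0; i < plotCells.Length; i++)
        {
            float w = Mathf.Min(objectFootprint.xMax, plotCells[i].xMax) -
                      Mathf.Max(objectFootprint.xMin, plotCells[i].xMin);
            float h = Mathf.Min(objectFootprint.yMax, plotCells[i].yMax) -
                      Mathf.Max(objectFootprint.yMin, plotCells[i].yMin);
            float area = (w > 0f && h > 0f) ? w * h : 0f;   // overlap area of the two rectangles
            if (area > bestArea) { bestArea = area; best = i; }
        }
        return best;  // the fixed object is kept only in this plot and removed from the others
    }
}
```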
  • The texture is an RGB texture whose size can be customized.
  • The parameters of a fragmented plot include the vertex coordinates, size, and position coordinates of the fragmented plot on the map.
  • The fragmented plot files in step S2 and the texture files in step S3 are named and arranged in sequence.
  • The color of the fragmented plot in step S3 is an entity color formed by combining the original color with the light-baked color.
  • A recovery method based on the large map baking and cutting method: first, initialize from the cutting parameter file obtained in step S4, load the position coordinates of the fragmented plots, and cache the paths of the fragmented plot files and the texture files; then, in real time, calculate from the position coordinates whether each fragmented plot falls into the range that needs to be displayed; if a fragmented plot falls into the range that needs to be displayed and has not yet been loaded, construct and display its prefab based on the cached fragmented plot file and texture file; if a fragmented plot does not fall into the range that needs to be displayed and has been loaded for more than a preset period of time, unload the fragmented plot.
  • The specific process of judging whether a fragmented plot falls within the range that needs to be displayed is to calculate, for all fragmented plots before each frame is rendered, whether any vertex of the plot will fall within the screen range in the next frame.
  • Alternatively, the specific process is to determine, before each frame is rendered, the coordinates on the map of the screen center point in the next frame; if any vertex of a fragmented plot is located within a preset radius of that center point in the next frame, the fragmented plot falls into the range that needs to be displayed.
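  • As an illustration of the second judging process (the center-point radius test), the following Unity C# sketch assumes the predicted next-frame screen-center point on the map and the preset radius are supplied by the caller; the names are hypothetical.

```csharp
// Illustrative sketch: a plot is "in range" if any of its vertices lies within a preset
// radius of the map point that the screen centre will look at in the next frame.
using UnityEngine;

public static class PlotVisibility
{
    public static bool FallsInDisplayRange(Vector3[] plotVertices,
                                           Vector3 nextFrameScreenCenterOnMap,
                                           float presetRadius)
    {
        float r2 = presetRadius * presetRadius;
        foreach (Vector3 v in plotVertices)
        {
            // Compare squared distances on the map plane (height differences are ignored here).
            Vector3 d = v - nextFrameScreenCenterOnMap;
            d.y = 0f;
            if (d.sqrMagnitude <= r2) return true;
        }
        return false;
    }
}
```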
  • The present invention first determines the cutting parameters to ensure that the cut fragmented plots can still achieve the same baking effect as the source surface after independent light baking; it then cuts the source surface into a grid of fragmented plots, bakes the entity color corresponding to the source surface onto the texture maps, saves the grid and texture files of the fragmented plots, and outputs the cutting parameter file according to the preset cutting parameters and texture materials.
  • When in use, the cutting parameter file is loaded first, and then the resources of the fragmented plots within the screen range are loaded according to their fragmented plot files and texture files.
  • The operator only needs to make adjustments in advance to ensure that the recombined fragmented plots achieve the same effect as the source surface; the specific operation of cutting the source surface is then performed automatically on the basis of the cutting parameters, and the related files of the cut fragmented plots can be loaded on demand without being light-baked again, thereby achieving the purpose of optimizing performance.
  • Fig. 1 is a schematic flow diagram of the large-map baking and cutting method of the present invention.
  • Figure 2 shows the interface of the map editor and its setting parameters.
  • Figure 3 is a schematic diagram of the assembled prefab on a map.
  • Figure 4 is a schematic diagram of the actual scene of the game.
  • A method for baking and cutting a large map is shown in Figure 1.
  • In step S1, several fragmented plots are automatically cut using T4M in Unity3D, the plots are light-baked based on the baking origin, and an overall preview of the plots is output. If the operator judges from the overall preview that the connected fragmented plots do not match the source surface, the method returns to setting the number of plots and the plot size and outputs a new overall preview; otherwise, it proceeds to step S2.
  • The overall preview is a free 3D preview with a freely movable camera, or a screenshot determined by the viewing angle.
  • The cutting parameters may also include other parameters, such as the naming rules of the fragmented plot files and texture files to be generated next.
  • When T4M cuts the fragmented plots, it first determines the plots to be cut according to the cutting parameters, and then determines, based on the position of each fixed object, whether that fixed object would be cut across more than one fragmented plot. If so, it identifies the first fragmented plot, i.e. the plot occupying the largest area of the fixed object among those plots, includes the fixed object in the first fragmented plot, and removes the fixed object from the other fragmented plots. This judgment is repeated until every fixed object is included in only a single fragmented plot, and the fragmented plots are output. In a modified implementation, an algorithm can also be used to determine whether the edges between fragmented plots are smooth.
  • For example, it can be calculated whether the terrain, lighting, and color transitions on both sides of an edge are smooth, whether there are faults, and whether shadows or light coverage produced by the terrain or fixed objects are also reflected on the fragmented plots, so as to likewise judge whether the fragmented plots are consistent with the source surface.
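  • A minimal sketch of one possible edge-smoothness check of this kind is given below, assuming the two adjacent plots expose matching lists of shared-edge vertices and baked colors; the tolerances and names are illustrative assumptions, not the patent's algorithm.

```csharp
// Illustrative sketch: compare the shared edge of two adjacent plots for height steps
// ("faults") and baked-colour jumps (visible seams).
using UnityEngine;

public static class EdgeCheck
{
    public static bool SharedEdgeIsSmooth(Vector3[] edgeA, Color[] colorsA,
                                          Vector3[] edgeB, Color[] colorsB,
                                          float maxHeightGap, float maxColorGap)
    {
        if (edgeA.Length != edgeB.Length || colorsA.Length != colorsB.Length) return false;
        for (int i = 0; i < edgeA.Length; i++)
        {
            // A fault shows up as a height step between the two sides of the cut.
            if (Mathf.Abs(edgeA[i].y - edgeB[i].y) > maxHeightGap) return false;
            // A seam shows up as a baked-colour jump across the edge.
            Color dc = colorsA[i] - colorsB[i];
            if (Mathf.Abs(dc.r) + Mathf.Abs(dc.g) + Mathf.Abs(dc.b) > maxColorGap) return false;
        }
        return true;
    }
}
```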
  • In step S2, the source surface is cut into a grid of several fragmented plots and saved as fragmented plot files.
  • In step S3, the entity colors of the fragmented plots are baked onto an RGB texture of the size set by the operator, the texture is saved as a texture file, a material corresponding to the texture is created, and a shader file is assigned according to the cutting parameters of step S1.
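  • The following Unity C# sketch illustrates saving a plot's baked colors as a customizable RGB texture file, assuming the combined (original + light-baked) colors are already available as a pixel array; the method name and path handling are illustrative.

```csharp
// Illustrative sketch: write the baked plot colours to an RGB texture file (step S3).
using System.IO;
using UnityEngine;

public static class PlotTextureExporter
{
    public static void SaveRgbTexture(Color[] bakedPixels, int size, string pngPath)
    {
        var tex = new Texture2D(size, size, TextureFormat.RGB24, false); // customizable RGB texture
        tex.SetPixels(bakedPixels);   // bakedPixels must contain size * size entries
        tex.Apply();
        File.WriteAllBytes(pngPath, tex.EncodeToPNG());                  // the texture file of step S3
    }
}
```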
  • The entity color is a combination of the original color and the light-baked color.
  • In step S4, a cutting parameter file is output, including the parameters of all fragmented plots, the paths of the fragmented plot files and texture files, and the paths of the material and shader files.
  • The parameters of a fragmented plot include the vertex coordinates, size, and position coordinates of the fragmented plot on the map.
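  • One possible shape of the cutting parameter file, serialized here with Unity's JsonUtility, is sketched below; the field names are illustrative, the patent only requires that the plot parameters and the file paths listed above be recorded.

```csharp
// Illustrative sketch of a cutting parameter file (step S4), serialised as JSON.
using UnityEngine;

[System.Serializable]
public class PlotRecord
{
    public Vector3[] vertexCoordinates;  // vertex coordinates of the fragmented plot
    public Vector2 size;                 // size of the plot
    public Vector3 positionOnMap;        // position coordinates of the plot on the map
    public string plotFilePath;          // grid/mesh file saved in step S2
    public string textureFilePath;       // texture file saved in step S3
    public string materialPath;          // material created in step S3
    public string shaderFilePath;        // shader assigned from the cutting parameters
}

[System.Serializable]
public class CuttingParameterFile
{
    public PlotRecord[] plots;

    public string ToJson() => JsonUtility.ToJson(this, true);
    public static CuttingParameterFile FromJson(string json) =>
        JsonUtility.FromJson<CuttingParameterFile>(json);
}
```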
  • After cutting, the game engine can re-import the texture files and the cutting parameter file in advance, assign the textures to their corresponding materials in order, read the fragmented plot files of step S2, and load the grids of the fragmented plots; the materials and the plot grids, together with the fixed objects allocated to them, form prefabricated bodies of the fragmented plots, such as the reusable Prefab objects in Unity3D.
  • Each Prefab corresponds to one cell of the grid into which the source surface is cut. For example, if a map part is cut into 5 rows and 5 columns, this results in 25 Prefabs.
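  • The following sketch illustrates assembling one fragmented plot into a reusable object at runtime, in the spirit of the Prefab assembly described above; loading the mesh and texture from the cached paths is assumed to happen elsewhere, and the names are illustrative.

```csharp
// Illustrative sketch: build a displayable plot object from its grid, texture and shader.
using UnityEngine;

public static class PlotAssembler
{
    public static GameObject BuildPlot(Mesh plotMesh, Texture2D plotTexture,
                                       Shader shader, Vector3 positionOnMap)
    {
        var go = new GameObject("FragmentedPlot");
        go.transform.position = positionOnMap;

        go.AddComponent<MeshFilter>().sharedMesh = plotMesh;                 // grid of the plot (step S2)
        var renderer = go.AddComponent<MeshRenderer>();
        var material = new Material(shader) { mainTexture = plotTexture };   // material + texture (step S3)
        renderer.sharedMaterial = material;
        return go;   // fixed objects allocated to this plot would be parented under 'go'
    }
}
```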
  • A recovery method based on the large map baking and cutting method: first, the cutting parameter file obtained when the cutting method has completed is loaded, the position coordinates of the fragmented plots are read, and the paths of the fragmented plot files and texture files are cached; then, in real time, it is calculated from the position coordinates whether each fragmented plot falls into the range that needs to be displayed; if a fragmented plot falls into the range that needs to be displayed and has not yet been loaded, its prefab is constructed and displayed based on the cached fragmented plot file and texture file; if a fragmented plot does not fall into the range that needs to be displayed and has been loaded for more than a preset period of time, the fragmented plot is unloaded.
  • Since a fragmented plot includes the fixed objects allocated to it, the surface of the fragmented plot and its fixed objects are loaded together when they need to be displayed.
  • The specific process of judging whether a fragmented plot falls within the range that needs to be displayed is to calculate, for all fragmented plots before each frame is rendered, whether any bounding vertex of the plot will fall within the screen range in the next frame.
  • Alternatively, the specific process is to determine, before each frame is rendered, the coordinates on the map of the screen center point in the next frame; if any bounding vertex of a fragmented plot is located within a preset radius of that center point in the next frame, the fragmented plot falls into the range that needs to be displayed.
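  • A minimal sketch of the per-frame load/unload loop of the recovery method is given below, using the bounding-vertex-on-screen test approximated with the current camera rather than a predicted next-frame camera; the PlotEntry fields and the reuse of the PlotAssembler sketch above are illustrative assumptions.

```csharp
// Illustrative sketch: stream fragmented plots in and out based on screen visibility.
using System.Collections.Generic;
using UnityEngine;

public class PlotStreamer : MonoBehaviour
{
    [System.Serializable]
    public class PlotEntry
    {
        public Vector3[] boundingVertices;  // from the cutting parameter file
        public Mesh mesh;                   // cached plot file
        public Texture2D texture;           // cached texture file
        public Vector3 positionOnMap;       // from the cutting parameter file
        public GameObject instance;         // null while unloaded
        public float lastVisibleTime;
    }

    public List<PlotEntry> plots = new List<PlotEntry>();
    public Shader plotShader;
    public float unloadDelay = 5f;          // preset period of time before unloading

    void Update()
    {
        Camera cam = Camera.main;
        foreach (var plot in plots)
        {
            bool visible = AnyVertexOnScreen(cam, plot.boundingVertices);
            if (visible)
            {
                plot.lastVisibleTime = Time.time;
                if (plot.instance == null)
                    plot.instance = PlotAssembler.BuildPlot(plot.mesh, plot.texture,
                                                            plotShader, plot.positionOnMap);
            }
            else if (plot.instance != null && Time.time - plot.lastVisibleTime > unloadDelay)
            {
                Destroy(plot.instance);     // unload the plot after the preset period
                plot.instance = null;
            }
        }
    }

    static bool AnyVertexOnScreen(Camera cam, Vector3[] vertices)
    {
        foreach (Vector3 v in vertices)
        {
            Vector3 p = cam.WorldToViewportPoint(v);
            if (p.z > 0f && p.x >= 0f && p.x <= 1f && p.y >= 0f && p.y <= 1f) return true;
        }
        return false;
    }
}
```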
  • In contrast, the existing technology requires 4 RGBA maps and 4 RGB maps to form the control maps of a plot, and at 2K resolution without compression the size of one RGBA map is 16 MB.
  • A plot after cutting is generally about 256 KB, and at most no more than 3-5 MB. This greatly reduces memory usage, and when the user slides the screen so that different parts of the map are loaded correspondingly, a display effect consistent with the prior art is still maintained.
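  • As a back-of-envelope check of these figures (derived from the numbers above, not stated as such in the source): an uncompressed 2048 × 2048 RGBA texture occupies 2048 × 2048 × 4 bytes = 16,777,216 bytes ≈ 16 MB, and an RGB texture of the same size about 12 MB, so the 4 RGBA + 4 RGB control maps of the prior art amount to roughly 4 × 16 MB + 4 × 12 MB ≈ 112 MB, compared with a few hundred kilobytes per fragmented plot after cutting.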

Abstract

The present invention relates to a large map baking and cutting method and a recovery method. The cutting method comprises the following steps: S1, acquiring a source ground surface and determining cutting parameters; S2, cutting the source ground surface into a grid of several fragmented plots and storing them as a fragmented plot file; S3, for each fragmented plot, baking the color of the fragmented plot onto a texture, storing the texture as a texture file, creating a material corresponding to the texture, and assigning a shader file according to the cutting parameters of step S1; and S4, outputting a cutting parameter file comprising the parameters of all fragmented plots, the paths of the fragmented plot file and the texture file, the material, and the path of the shader file. The present invention has the advantage that the specific operation of cutting the source ground surface is performed automatically on the basis of the predetermined cutting parameters, and the relevant files of the cut fragmented plots can be loaded on demand during use, so as to achieve the goal of performance optimization.
PCT/CN2020/094672 2020-05-26 2020-06-05 Large map baking and cutting method and recovery method WO2021237785A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010455886.4A CN111729303B (zh) 2020-05-26 2020-05-26 A large map baking and cutting method and recovery method
CN202010455886.4 2020-05-26

Publications (1)

Publication Number Publication Date
WO2021237785A1 (fr) 2021-12-02

Family

ID=72648156

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/094672 WO2021237785A1 (fr) 2020-05-26 2020-06-05 Large map baking and cutting method and recovery method

Country Status (2)

Country Link
CN (1) CN111729303B (fr)
WO (1) WO2021237785A1 (fr)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NL184306C (nl) * 1985-10-02 1989-06-16 Goede Houdstermaatschappij Spraying device for producing bakery products from a foam mass
JP2006162794A (ja) * 2004-12-03 2006-06-22 Konica Minolta Photo Imaging Inc Optical device manufacturing method, optical device, image display device, and head-mounted display
CN1842003A (zh) * 2005-03-30 2006-10-04 广州市领华科技有限公司 Method for realizing instant messaging with multiple contacts in a single dialog window
CN105931286A (zh) * 2016-03-29 2016-09-07 赵岳生 Real-time terrain shadow simulation method in a 3D visual scene simulation GIS geographic information system
CN110956673A (zh) * 2018-09-26 2020-04-03 北京高德云图科技有限公司 Map drawing method and device
CN110503719A (zh) * 2019-08-21 2019-11-26 山西新华电脑职业培训学校 VR game design method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004329682A (ja) * 2003-05-09 2004-11-25 Nintendo Co Ltd Game system and game program
CN105321200A (zh) * 2015-07-10 2016-02-10 苏州蜗牛数字科技股份有限公司 Preprocessing method for offline rendering
CN106075907A (zh) * 2016-06-02 2016-11-09 苏州乐米信息科技有限公司 Game map editing method
CN107358649A (zh) * 2017-06-07 2017-11-17 腾讯科技(深圳)有限公司 Terrain file processing method and apparatus
CN108109204A (zh) * 2017-12-18 2018-06-01 苏州蜗牛数字科技股份有限公司 Method and system for creating and rendering large-scale terrain
CN109364483A (zh) * 2018-10-10 2019-02-22 苏州好玩友网络科技有限公司 Large scene map segmentation method and player-view scene update method using the same

Also Published As

Publication number Publication date
CN111729303B (zh) 2024-04-05
CN111729303A (zh) 2020-10-02

Similar Documents

Publication Publication Date Title
US20210027525A1 (en) Forward rendering pipeline with light culling
EP1754196B1 (fr) Resource management for procedural terrain creation from rules
EP1754199B1 (fr) Terrain editing tool for rule-based procedural terrain generation
EP1763846B1 (fr) Rule-based procedural terrain generation
US8004518B2 (en) Combined spatial index for static and dynamic objects within a three-dimensional scene
US8253730B1 (en) System and method for construction of data structures for ray tracing using bounding hierarchies
US20100289799A1 (en) Method, system, and computer program product for efficient ray tracing of micropolygon geometry
US8725466B2 (en) System and method for hybrid solid and surface modeling for computer-aided design environments
US10665010B2 (en) Graphics processing systems
CN106485776A (zh) Method and system for real-time rendering of large-scale scenes in a 3D game
KR20080018404A (ko) Computer-readable recording medium storing a background creation program for game production
WO2008037615A1 (fr) Workload distribution in a ray-tracing-based image processing system
CN112419498A (zh) Scheduling and rendering method for massive oblique photography data
CN116228960A (zh) Construction method, construction system and display system for a virtual museum display system
WO2021237785A1 (fr) Large map baking and cutting method and recovery method
CN116883572B (zh) Rendering method, apparatus and device, and computer-readable storage medium
Olsson et al. Efficient real-time shading with many lights
Hoppe et al. Adaptive meshing and detail-reduction of 3D-point clouds from laser scans
CN117688635A (zh) Ray-tracing-based BIM model shelling method, system and medium
CN112001996A (zh) Real-time 3D model rendering method based on runtime texture recombination
CN115187720A (zh) Method for optimizing terrain rendering based on digital terrain analysis
Douglas Diffuse Global Illumination via Direct and Virtual Indirect Light Sources
Koca Representation, editing and real-time visualization of complex 3D terrains
Seletsky Real Time Visibility Culling with Hardware Occlusion Queries and Uniform Grids
Ahrens et al. Ray Tracing Dynamic Scenes with Shadows on the GPU

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20937275

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20937275

Country of ref document: EP

Kind code of ref document: A1