WO2020038441A1 - Map rendering method and apparatus, computer device, and storage medium - Google Patents

Map rendering method and apparatus, computer device, and storage medium

Info

Publication number
WO2020038441A1
WO2020038441A1 (PCT/CN2019/102035)
Authority
WO
WIPO (PCT)
Prior art keywords
data
map
block
rendering
terminal
Prior art date
Application number
PCT/CN2019/102035
Other languages
English (en)
French (fr)
Inventor
邵岳伟 (Shao Yuewei)
Original Assignee
腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Company Limited)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited (腾讯科技(深圳)有限公司)
Priority to JP2020550861A (granted as JP7085012B2)
Priority to EP19852334.2A (granted as EP3753614B1)
Publication of WO2020038441A1
Priority to US17/017,520 (granted as US11852499B2)

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3635Guidance using 3D or perspective road maps
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/53Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5378Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for displaying an additional top view, e.g. radar screens or maps
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35Details of game servers
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/69Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/822Strategy games; Role-playing games
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30Map- or contour-matching
    • G01C21/32Structuring or formatting of map data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804Creation or updating of map data
    • G01C21/3807Creation or updating of map data characterised by the type of data
    • G01C21/3811Point data, e.g. Point of Interest [POI]
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3863Structures of map data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3885Transmission of map data to client devices; Reception of map data by client devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Definitions

  • the present application relates to the field of network technologies, and in particular, to a map rendering method, device, computer device, and storage medium.
  • in the related art, rendering maps in games is generally difficult: because the game's rendering method conflicts with the rendering methods of existing map components, the existing map components cannot be used directly to render on the native side.
  • games generally use the Unity engine to render maps, but due to the performance limitations of the Unity engine itself, this causes runtime stalls and wasted memory, which not only degrades the displayed image but also affects the normal operation of the game.
  • Various embodiments provided in this application provide a map rendering method, device, computer equipment, and storage medium.
  • a map rendering method including:
  • a block request is sent to the native map module of the terminal system through the 3D display engine, and the block request carries a block identifier corresponding to the target field of view;
  • the terminal calls a target map interface through the native map module, analyzes the block data, and generates map grid data based on the parsed data, where the map grid data includes road data and building data;
  • the terminal performs map rendering based on road data and building data in the map grid data.
  • a map rendering method including:
  • when the terminal receives a grid update instruction, it sends a block request to the native map module of the terminal system, where the block request carries the block identifier corresponding to the target field of view;
  • map rendering is performed based on road data and building data in the map grid data.
  • a map rendering method including:
  • the terminal receives a block request sent by the three-dimensional display engine, where the block request carries a block identifier corresponding to the target field of view;
  • the terminal calls a target map interface, analyzes the block data, and generates map grid data based on the parsed data, where the map grid data includes road data and building data;
  • map rendering is performed based on road data and building data in the map grid data.
  • a map rendering device including:
  • a sending module configured to send a block request to the native map module of the terminal system when the grid update instruction is received, where the block request carries a block identifier corresponding to the target field of vision;
  • a receiving module configured to receive a block data address obtained by the native map module based on the block identifier
  • the sending module is further configured to obtain the block data in the target field of view based on the block data address and send the block data to the native map module, which obtains map grid data based on the block data, the map grid data including road data and building data;
  • a rendering module is configured to perform map rendering based on road data and building data in the map grid data when a rendering instruction is received.
  • a map rendering device including:
  • a receiving module configured to receive a block request sent by a three-dimensional display engine, where the block request carries a block identifier corresponding to a target field of view;
  • An obtaining module configured to obtain a block data address based on the block identifier
  • a sending module configured to send the block data address to the three-dimensional display engine
  • the receiving module is further configured to receive block data obtained by the three-dimensional display engine based on the block data address;
  • a generating module configured to generate map grid data based on the parsed data, the map grid data including road data and building data;
  • a rendering module is configured to perform map rendering based on road data and building data in the map grid data when a rendering instruction is received.
  • in one aspect, a computer device is provided, including a processor and a memory.
  • the memory stores computer-readable instructions.
  • when executed by the processor, the computer-readable instructions cause the processor to perform the method described in the above embodiments.
  • one or more non-volatile storage media storing computer-readable instructions are provided.
  • when the computer-readable instructions are executed by one or more processors, the one or more processors are caused to execute the method described in the foregoing embodiments.
  • FIG. 1 is an implementation environment diagram of a map rendering method according to an embodiment of the present application
  • FIG. 3 is a flowchart of a map rendering method according to an embodiment of the present application.
  • FIG. 4 is an example diagram of a target visual field range provided by an embodiment of the present application.
  • FIG. 5 is an example diagram of a target visual field range provided by an embodiment of the present application.
  • FIG. 6 is an example diagram of a target visual field range provided by an embodiment of the present application.
  • FIG. 7 is a flowchart of a rendering process provided by an embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of a map rendering device according to an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of a map rendering device according to an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of a map rendering device according to an embodiment of the present application.
  • FIG. 11 is a block diagram of a computer device 1100 according to an exemplary embodiment.
  • FIG. 12 is an internal structure diagram of a terminal provided by an exemplary embodiment of the present application.
  • Mesh: in rendering, a mesh represents a drawable entity.
  • mesh data contains at least a set of vertex data, where each vertex may contain attributes such as coordinates and normal vectors; mesh data may also contain index data for indexing into the vertex data.
  • Shader: in rendering, refers to a piece of instruction code running on the graphics processing unit (GPU), used to instruct the GPU to render a mesh and generate the rendering result, for example by computing vertex transformations or lighting.
  • GPU: graphics processing unit.
  • FIG. 1 is an implementation environment diagram of a map rendering method provided by an embodiment of the present application.
  • the terminal in this implementation environment includes a three-dimensional display engine, a native map module of the terminal system (for example, a link library may be adopted), a rendering module, and a target map interface.
  • the three-dimensional display engine may be a three-dimensional display engine based on application logic.
  • the three-dimensional display engine may be a Unity game engine.
  • the Unity game engine generally uses the C# language for game-logic development.
  • the above-mentioned three-dimensional display engine may be the C# script part related to game logic; this part is independent of the terminal system.
  • the native map module may refer to a module implemented in a computer language such as C++, and may be stored in the terminal system in the form of a link library for the 3D display engine to call.
  • the rendering module can be implemented by the GPU; it renders the screen according to the grid data bound by the 3D display engine or the native map module, to realize the interface display on the terminal.
  • the target map interface can refer to a map API used to access the target map server and use any map data service provided by the target map server.
  • the 3D display engine triggers the process through preset processing logic, and the native map module implements the specific steps; through the linkage of the two, the rendering process achieves the rendering effect shown in FIG. 2.
  • FIG. 3 is a flowchart of a map rendering method according to an embodiment of the present invention.
  • in this embodiment, the native map module is implemented by a target link library as an example; referring to FIG. 3, the method includes:
  • the 3D display engine on the terminal receives a grid update instruction.
  • the three-dimensional display engine may be a three-dimensional display engine provided on the game side, and may read related game data to obtain relevant display data, thereby realizing display of a game interface.
  • the game configuration file may be stored on the terminal, or may be stored on the game server, and read by the terminal through access to the game server.
  • the grid update instruction refers to an instruction for updating a map grid.
  • the grid update instruction may be a frame update instruction, that is, an update instruction for the image frame to be displayed on the terminal.
  • the grid update instruction may be triggered based on the player's control of the virtual object or the movement of the terminal in which the player is logged in in the real world.
  • the above update may refer to loading grid data within the current field of view and unloading grid data beyond the field of view.
  • some games require the map to have the same size as the real world, but because the player's field of view is limited, as the player moves on the map it is necessary to dynamically load the sections entering the field of view and unload the sections leaving it. Therefore, in some games the map is divided evenly into a two-dimensional grid, and as the player's field of view moves, blocks in the grid are dynamically loaded and unloaded.
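The load/unload bookkeeping described above can be sketched as follows. This is a minimal illustration in C++ (the language the native map module is said to use); the square block size, the view radius measured in whole blocks, and the set-difference approach are assumptions made for illustration, not details disclosed by the application.

```cpp
#include <cmath>
#include <set>
#include <utility>

// A block is identified by its integer grid coordinates (col, row).
using BlockId = std::pair<int, int>;

// Map a world position to the block containing it, assuming square blocks
// of a fixed side length (hypothetical parameter).
BlockId BlockAt(double x, double y, double blockSize) {
    return { static_cast<int>(std::floor(x / blockSize)),
             static_cast<int>(std::floor(y / blockSize)) };
}

// Blocks whose identifiers fall inside a square field of view centred on
// the player; `radiusInBlocks` extends the view by whole blocks.
std::set<BlockId> VisibleBlocks(double x, double y, double blockSize,
                                int radiusInBlocks) {
    BlockId centre = BlockAt(x, y, blockSize);
    std::set<BlockId> visible;
    for (int dc = -radiusInBlocks; dc <= radiusInBlocks; ++dc)
        for (int dr = -radiusInBlocks; dr <= radiusInBlocks; ++dr)
            visible.insert({ centre.first + dc, centre.second + dr });
    return visible;
}

// Diff the new visible set against the currently loaded set: blocks that
// entered the view must be loaded, blocks that left it must be unloaded.
void DiffBlocks(const std::set<BlockId>& loaded,
                const std::set<BlockId>& visible,
                std::set<BlockId>& toLoad, std::set<BlockId>& toUnload) {
    for (const BlockId& b : visible)
        if (!loaded.count(b)) toLoad.insert(b);
    for (const BlockId& b : loaded)
        if (!visible.count(b)) toUnload.insert(b);
}
```

Moving one block to the right, for example, queues one column of blocks for loading and one column for unloading, which is the dynamic behaviour the paragraph describes.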
  • the three-dimensional display engine sends a block request to the target link library of the terminal native system, and the block request carries a block identifier corresponding to the target field of view.
  • when the three-dimensional display engine needs to update the currently displayed image frame, it may first obtain the block identifier corresponding to the target field of view according to the current positioning information of the terminal.
  • the block corresponding to the target field of view may refer to a block in the target field of view, that is, every block indicated by the determined block identifiers lies entirely within the target field of view, as shown in FIG. 4; or, at least a part of each determined block lies within the target field of view, as in the situation shown in FIG. 5.
  • the target field of view can match the size and shape of the terminal display area; it can also be slightly larger than the terminal display area, so that display remains fast when the user subsequently moves a small distance, ensuring display continuity.
  • the block corresponding to the target field of view may also refer to the blocks in the target field of view together with adjacent blocks within a certain area around it; the certain area may be, for example, one block's side length, as shown in FIG. 6.
  • the target field of view may refer to a circular area with the positioning information as the center point and a preset distance as the radius, so that when the user controls the virtual object to rotate, the surrounding geography can be observed.
  • the target field of view area may also be a rectangular area.
  • the above-mentioned determination of the target field of view may be performed in any manner, and the shape of the determined target field of view may also be determined according to user requirements or system defaults, which is not limited in the embodiment of the present invention.
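The block-membership cases illustrated in FIG. 4 to FIG. 6 reduce to simple rectangle tests when the field of view is rectangular. The following is a minimal sketch; the `Rect` layout and the margin value are illustrative assumptions, not the application's actual implementation.

```cpp
// Axis-aligned rectangle in world coordinates.
struct Rect {
    double minX, minY, maxX, maxY;
};

// True when `block` lies entirely inside the field of view (FIG. 4 case).
bool FullyInside(const Rect& block, const Rect& view) {
    return block.minX >= view.minX && block.maxX <= view.maxX &&
           block.minY >= view.minY && block.maxY <= view.maxY;
}

// True when at least part of `block` overlaps the field of view (FIG. 5 case).
bool Overlaps(const Rect& block, const Rect& view) {
    return block.minX < view.maxX && block.maxX > view.minX &&
           block.minY < view.maxY && block.maxY > view.minY;
}

// FIG. 6 case: also accept blocks within `margin` of the view, e.g. one
// block side length (the margin value is an assumption for illustration).
bool WithinMargin(const Rect& block, const Rect& view, double margin) {
    Rect grown{ view.minX - margin, view.minY - margin,
                view.maxX + margin, view.maxY + margin };
    return Overlaps(block, grown);
}
```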
  • the three-dimensional display engine may obtain, from the game configuration file, the block identifiers of the blocks located at the real-world geographic position corresponding to the target field of view; in the game configuration file, block identifiers can be stored in correspondence with geographic location information, so that the corresponding block identifiers can be obtained accurately for whichever geographic location is determined.
  • the target link library obtains a block data address based on the block identifier.
  • the target link library may store a preset splicing formula.
  • the preset splicing formula may include an address splicing method for the 3D display engine.
  • the target link library may perform address splicing on the block identifier according to the preset splicing formula to obtain the block data address.
  • the obtaining process of step 303 may also be implemented as follows: the target link library parses the block identifier to obtain the block data address corresponding to each block identifier. This embodiment of the present application places no restriction on which acquisition method is used.
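A preset splicing formula of this kind is essentially string composition over the block identifier. The application does not disclose the actual formula, so the host name and URL template below are purely hypothetical:

```cpp
#include <string>

// Splice a download URL from a block identifier using a preset template.
// The host, path layout, and ".dat" suffix are illustrative assumptions.
std::string SpliceBlockAddress(const std::string& host, int zoom,
                               int col, int row) {
    return "https://" + host + "/tiles/" + std::to_string(zoom) +
           "/" + std::to_string(col) + "/" + std::to_string(row) + ".dat";
}
```

The 3D display engine can then use the returned address directly in its download request to the target server.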
  • the target link library sends the block data address to the three-dimensional display engine.
  • the three-dimensional display engine obtains block data in the target field of vision based on the block data address.
  • the three-dimensional display engine sends a download request to the target server, the download request carrying the block data address, and then receives the block data within the target field of view returned by the target server.
  • the target server may be a server that provides a map data service for the three-dimensional display engine.
  • the three-dimensional display engine sends the block data in the target field of view to the target link library.
  • the embodiment of the present application transfers this part of the work to the target link library native to the terminal system, to reduce the workload of the 3D display engine and avoid impacting the normal operation of the game application.
  • when the target link library receives the block data, it calls the target map interface to parse the block data.
  • the target map interface is used to associate with the target map service, so that the target link library can call the target map service through the target map interface to parse the block data. Because the target link library generally does not itself have the ability to parse block data, the target map service can provide the parsing function through the target map interface; this lets the native target link library participate in the rendering process of the three-dimensional display engine, greatly reducing the processing pressure on the three-dimensional display engine during the entire rendering process.
  • the target link library simplifies the road data and building data in the parsed data to obtain the map grid data.
  • the data obtained through the above analysis process includes road data and building data.
  • a road itself may involve complex situations such as bends or crossings, and roads come in different sizes, so the road data can be simplified. Specifically, the simplification may include at least one of the following processes:
  • the road size of the road data is scaled to a preset size
  • the preset size can match the display area of the terminal, or can be a system preset size, so that the displayed road size matches the size of the terminal's display area.
  • the preset size may also be determined according to a user's zoom operation on the terminal or a set zoom ratio, which is not limited in this embodiment of the present invention.
  • the embodiment of the present invention does not limit the processing method of the simplified processing.
  • the purpose of the simplified processing is to reduce the line density in the road data and improve the display clarity.
  • the data parsed by the target map interface includes road data containing road vertex coordinates and index data, where the road vertex coordinates are three-dimensional vectors representing the xyz coordinates of the vertices. However, the display in the embodiment of the present application need not include elevation and only needs to appear tiled on the ground surface, so the road vertex coordinates can be reduced in dimension: the processed road vertex coordinates can be represented by two-dimensional vectors, that is, they need only the two dimensions x and y.
  • the game effect display also requires 3 floating-point parameters; based on the dimension-reduced road vertex coordinates plus these 3 floating-point parameters, a vertex can be represented by a five-dimensional vector, which reduces processing compared with the current six-dimensional vector representation.
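The reduction just described, dropping z so each vertex becomes a 2-D position plus 3 floating-point parameters (five floats instead of six), can be sketched as follows. The meaning of the three parameters is not specified in this section, so they are treated as opaque and passed through unchanged:

```cpp
#include <array>
#include <vector>

// A parsed road vertex: xyz position plus three shader parameters.
struct Vertex3D {
    float x, y, z;
    std::array<float, 3> params;
};

// After dimensionality reduction: xy position plus the same parameters,
// five floats per vertex instead of six.
struct Vertex2D {
    float x, y;
    std::array<float, 3> params;
};

// Roads are displayed tiled on the ground, so the z coordinate carries no
// useful information and can simply be dropped.
std::vector<Vertex2D> ReduceDimension(const std::vector<Vertex3D>& in) {
    std::vector<Vertex2D> out;
    out.reserve(in.size());
    for (const Vertex3D& v : in)
        out.push_back({ v.x, v.y, v.params });
    return out;
}
```

For a large road mesh this saves one float per vertex in memory and per-vertex processing, which is the gain the paragraph points to.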
  • road data may include data such as lakes, grasslands, and roads, which are not specifically limited in this embodiment of the present invention.
  • the road material includes the shader used for road rendering and the parameters required by the shader.
  • the dimension-reduction processing is managed by the target link library native to the terminal system, so the three-dimensional display engine can no longer support the dimension-reduced data: generally the three-dimensional display engine supports only three- or four-dimensional vectors as coordinate parameters, not two-dimensional vectors. Therefore, a shader implemented on the native side must be used, and the shader parameters must be passed from the native side to the rendering side.
  • the building data can be simplified into building outline data and height data, so that the final displayed building need not show the full detail of the building; only a brief outline is displayed, which improves rendering efficiency and greatly reduces the amount of data processed throughout the process.
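Simplifying a building to outline plus height means its mesh can be generated as a straight prism: the 2-D outline extruded upward. A hedged sketch of just the vertex generation (face indexing is omitted; the data layout is an assumption for illustration):

```cpp
#include <utility>
#include <vector>

struct Vec3 { float x, y, z; };

// Extrude a 2-D building outline (a ring of x,y points, closed implicitly)
// into a prism of the given height: a bottom ring at z = 0 and a top ring
// at z = height. Side faces would index adjacent vertices of the two rings.
std::vector<Vec3> ExtrudeOutline(
        const std::vector<std::pair<float, float>>& outline, float height) {
    std::vector<Vec3> verts;
    verts.reserve(outline.size() * 2);
    for (const auto& p : outline)                 // bottom ring
        verts.push_back({ p.first, p.second, 0.0f });
    for (const auto& p : outline)                 // top ring
        verts.push_back({ p.first, p.second, height });
    return verts;
}
```

An N-point outline yields only 2N vertices, regardless of how detailed the real building is, which is where the data reduction comes from.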
  • the rendering process may be performed by an OpenGL device on the terminal.
  • the OpenGL device may perform rendering according to the road data and building data bound by the 3D display engine and/or the target link library.
  • for a specific rendering process, refer to the following embodiment and the interactive process shown in FIG. 7.
  • the system's native target link library replaces the three-dimensional display engine in processing the map grid data, so the three-dimensional display engine no longer needs to process large amounts of data and incurs no excessive time overhead; this avoids runtime stalls and wasted memory, greatly improves the displayed image, and does not affect the normal operation of the application.
  • the method provided in this embodiment integrates the rendering function on the native side into the rendering process of the 3D display engine itself, implementing map rendering in the native layer for a stutter-free experience.
  • in step 309, in order to make the rendered scene better, without overlap or other display artifacts, rendering can follow this principle:
  • objects less transparent than the target transparency are rendered first, and objects more transparent than the target transparency are rendered afterwards. For example, all opaque objects should be rendered before all transparent objects.
  • Roads are opaque objects, so roads have a higher rendering priority.
  • Buildings are translucent objects, so they are rendered together with other translucent objects.
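The ordering principle above, opaque objects before transparent ones, can be sketched as a stable partition of the draw list. The alpha field and the 1.0 opacity threshold are assumptions for illustration:

```cpp
#include <algorithm>
#include <string>
#include <vector>

struct Renderable {
    std::string name;
    float alpha;          // 1.0f = fully opaque
};

// Order a draw list so every opaque object precedes every transparent one,
// matching the principle above; relative order within each group is kept.
void SortForRendering(std::vector<Renderable>& objects) {
    std::stable_partition(objects.begin(), objects.end(),
                          [](const Renderable& r) { return r.alpha >= 1.0f; });
}
```

In this scheme roads (opaque) sort into the first group and buildings (translucent) into the second, which matches the two passes the text assigns them.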
  • a target script is added to the scene's main camera.
  • in the target script, when the camera's OnPreRender event occurs, since roads are opaque objects, the terminal can render lakes, grasslands, and roads by calling the road rendering interface on the native side; when the camera's OnPostRender event occurs, the terminal can render the buildings on the map by calling the building rendering interface on the native side.
  • the specific process can refer to the flow in FIG. 7:
  • the 3D display engine clears the background.
  • the 3D display engine (Unity) clears the screen with the background color, and clears the depth information buffer and the stencil buffer at the same time.
  • the previously loaded rendering data, such as the graphics rendering context information, can be cleared.
  • the 3D display engine triggers the OnPreRender event function of the target script on the camera.
  • the camera's target script triggers the road rendering function of the map module on the game side.
  • the target link library sets the material data of the road into the OpenGL context.
  • the material data includes the shader and the parameters required by the shader.
  • the target link library traverses all blocks and, for each block currently in the camera's field of view, binds the road grid to the OpenGL context.
  • after binding, the target link library notifies the OpenGL context to draw.
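The traversal in the two steps above, walking all blocks, binding only those inside the camera's field of view, then issuing a draw, can be sketched with the actual OpenGL work abstracted behind callbacks. In a real implementation `bind` and `draw` would operate on the OpenGL context (e.g. buffer binding and a draw call); here they are stand-ins so the control flow is visible:

```cpp
#include <functional>
#include <vector>

struct Block {
    int id;
    bool inCameraView;   // set by the visibility test for the current frame
    int meshHandle;      // handle to this block's mesh (hypothetical)
};

// Traverse all blocks; for each block in the camera's field of view, bind
// its mesh and issue a draw call. Returns the number of draw calls made.
int DrawVisibleBlocks(const std::vector<Block>& blocks,
                      const std::function<void(int)>& bind,
                      const std::function<void()>& draw) {
    int drawCalls = 0;
    for (const Block& b : blocks) {
        if (!b.inCameraView) continue;
        bind(b.meshHandle);
        draw();
        ++drawCalls;
    }
    return drawCalls;
}
```

The same loop serves both the road pass and the later building pass; only the bound mesh data and material differ.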
  • the 3D display engine renders opaque objects in the scene (except for roads).
  • the 3D display engine renders transparent objects in the scene (except buildings).
  • the 3D display engine triggers the OnPostRender event function in the target script on the camera.
  • the target script on the camera triggers the building rendering function of the 3D display engine.
  • the 3D display engine sets the building materials into the OpenGL context.
  • the 3D display engine triggers rendering of the building.
  • the target link library traverses all blocks and, for each block currently in the camera's field of view, binds the grid data in the building data to the OpenGL context.
  • after binding, the target link library notifies the OpenGL context to draw.
  • the target link library binds the grid data in the road data to the context information of graphics rendering, and sets the material data in the road data in the context information of the graphics rendering;
  • the target link library binds the grid data in the building data to the graphics rendering context information, and the three-dimensional display engine sets the material data in the building data.
  • the graphics rendering context information can realize the rendering of a three-dimensional map in the three-dimensional display engine, instead of the map rendering of the native SDK, so as to achieve more complex rendering effects.
  • the grid update is performed on the native side of the system, achieving higher performance and a lower memory footprint.
  • the rendering function is also upgraded: when rendering roads, the target link library can bind the road materials, and when rendering buildings, the material system of the 3D display engine performs the rendering; the target link library's processing of map data is thus successfully embedded into the rendering of the 3D display engine, making it easier to achieve complex rendering effects.
  • steps in the embodiments of the present application are not necessarily performed sequentially in the order indicated by the step numbers. Unless explicitly stated herein, the execution order of these steps is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in each embodiment may include multiple sub-steps or stages; these sub-steps or stages are not necessarily completed at the same time but may be performed at different times, and their execution order is not necessarily sequential: they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
  • a terminal is further provided.
  • the terminal includes a map rendering device.
  • the map rendering device includes various modules, and each module may be implemented in whole or in part by software, hardware, or a combination thereof.
  • FIG. 8 is a structural diagram of a map rendering device according to an embodiment of the present application. Referring to FIG. 8, the device includes:
  • a first sending module 801, configured to send, when a grid update instruction is received, a block request to a native map module of a terminal system through a three-dimensional display engine, where the block request carries a block identifier corresponding to a target field-of-view region;
  • an address obtaining module 802, configured to obtain a block data address based on the block identifier through the native map module;
  • a second sending module 803, configured to send the block data address to the three-dimensional display engine;
  • a block data obtaining module 804, configured to obtain the block data in the target field-of-view region based on the block data address through the three-dimensional display engine;
  • a parsing module 805, configured to call a target map interface through the native map module to parse the block data and generate map grid data based on the parsed data, where the map grid data includes road data and building data;
  • a rendering module 806, configured to perform map rendering based on the road data and building data in the map grid data.
  • the address obtaining module is configured to concatenate, through the native map module, the block identifier into an address according to a preset concatenation formula to obtain the block data address.
  • the block data obtaining module is configured to send a download request to a target server through the three-dimensional display engine, the download request carrying the block data address, and to receive the block data within the target field-of-view region sent by the target server.
  • the parsing module is configured to perform, through the native map module, simplification processing on the road data and building data in the parsed data to obtain the map grid data.
  • the parsing module is configured to perform, through the native map module, dimensionality reduction processing on road vertex coordinates in the road data to obtain road vertex coordinates represented by two-dimensional vectors.
  • the rendering module is configured to render an object with a transparency greater than the target transparency during the rendering process, and then render an object with a transparency less than the target transparency.
  • the rendering module is configured to bind the mesh data and the material data in the road data to context information of graphics rendering through the native map module;
  • the mesh data in the building data is bound to the context information of the graphics rendering through the native map module, and the material data in the building data is bound to the context information of the graphics rendering by the three-dimensional display engine.
  • FIG. 9 is a structural diagram of a map rendering device according to an embodiment of the present application. Referring to FIG. 9, the device includes:
  • a sending module 901 is configured to send a block request to a native map module of a terminal system when the grid update instruction is received, where the block request carries a block identifier corresponding to a target field of vision area;
  • a receiving module 902 configured to receive a block data address obtained by the native map module based on the block identifier
  • the sending module 901 is further configured to obtain the block data in the target field-of-view region based on the block data address and send the block data to the native map module, and the native map module obtains map grid data based on the block data, where the map grid data includes road data and building data;
  • a rendering module 903 is configured to perform map rendering based on road data and building data in the map grid data when a rendering instruction is received.
  • the rendering module 903 is configured to set the material data in the building data in the context information of the graphics rendering through the three-dimensional display engine, and a graphics rendering interface performs map rendering based on the graphics-rendering context information.
  • FIG. 10 is a structural diagram of a map rendering device according to an embodiment of the present application. Referring to FIG. 10, the device includes:
  • the receiving module 1001 is configured to receive a block request sent by a three-dimensional display engine, where the block request carries a block identifier corresponding to a target field of view;
  • An obtaining module 1002 configured to obtain a block data address based on the block identifier
  • a sending module 1003, configured to send the block data address to the three-dimensional display engine
  • the receiving module 1001 is further configured to receive block data obtained by the three-dimensional display engine based on the block data address;
  • a calling module 1004, configured to call a target map interface to parse the block data;
  • a generating module 1005, configured to generate map grid data based on the parsed data, where the map grid data includes road data and building data;
  • a rendering module 1006 is configured to perform map rendering based on road data and building data in the map grid data when a rendering instruction is received.
  • the rendering module 1006 is configured to set, through the native map module, the mesh data in the road data in the context information of a graphics rendering interface and bind the material data in the road data to the context information of the graphics rendering interface; the mesh data in the building data is bound to the context information of the graphics rendering through the native map module, and the graphics rendering interface performs map rendering based on the graphics-rendering context information.
  • When the map rendering device provided in the foregoing embodiments performs map rendering, the division into the foregoing function modules is merely used as an example. In practical applications, the above functions may be allocated to different function modules as required; that is, the internal structure of the device is divided into different function modules to complete all or part of the functions described above. In addition, the map rendering device and the map rendering method embodiments provided by the foregoing embodiments belong to the same concept. For the specific implementation process, refer to the method embodiments; details are not described herein again.
  • Fig. 11 is a block diagram of a computer device 1100 according to an exemplary embodiment.
  • the computer device 1100 includes a processing component 1122, which further includes one or more processors, and a memory resource represented by a memory 1132, for storing instructions executable by the processing component 1122, such as an application program.
  • the application program stored in the memory 1132 may include one or more modules each corresponding to a set of instructions.
  • the processing component 1122 is configured to execute instructions to perform the above-mentioned map rendering method.
  • the computer device 1100 may further include a power supply component 1126 configured to perform power management of the computer device 1100, a wired or wireless network interface 1180 configured to connect the computer device 1100 to a network, and an input/output (I/O) interface 1158.
  • the computer device 1100 may operate based on an operating system stored in the memory 1132, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
  • FIG. 12 shows an internal structure diagram of a terminal in one embodiment.
  • the terminal includes a processor, a memory, a network interface, and an input device connected through a system bus.
  • the memory includes a non-volatile storage medium and an internal memory.
  • the non-volatile storage medium of the terminal stores an operating system and may also store computer-readable instructions which, when executed by the processor, cause the processor to implement a map rendering method.
  • the internal memory may also store computer-readable instructions which, when executed by the processor, cause the processor to perform a map rendering method.
  • the input device may be a touch layer covering a display screen, or a button, trackball, or touchpad provided on the computer device housing, or an external keyboard, touchpad, or mouse.
  • FIG. 12 is only a block diagram of a part of the structure related to the solution of the present application and does not constitute a limitation on the terminal to which the solution of the present application is applied.
  • a specific terminal may include more or fewer components than shown in the figure, or combine certain components, or have a different component arrangement.
  • the map rendering device provided in this application may be implemented in the form of a computer-readable instruction, and the computer-readable instruction may run on a terminal as shown in FIG. 12.
  • the memory of the terminal may store various program modules constituting the map rendering device, such as a first sending module 801, an address obtaining module 802, a second sending module 803, a block data obtaining module 804, a parsing module 805, and a rendering module 806.
  • the computer-readable instructions constituted by each program module cause the processor to execute the steps in the map rendering method of each embodiment of the present application described in this specification.
  • An embodiment of the present application provides a computer-readable storage medium, where computer-readable instructions are stored in the storage medium, and the computer-readable instructions are loaded by a processor to perform the operations in the map rendering method of the foregoing embodiments.
  • Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory can include random access memory (RAM) or external cache memory.
  • RAM is available in various forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • Instructional Devices (AREA)

Abstract

The embodiments of this application disclose a map rendering method and apparatus, a computer device, and a storage medium, belonging to the field of terminal technology. The method includes: when a grid update instruction is received, sending a block request to a native map module of a terminal through a three-dimensional display engine, the block request carrying a block identifier corresponding to a target field-of-view region; obtaining, through the native map module, a block data address based on the block identifier, and sending the block data address to the three-dimensional display engine; obtaining, through the three-dimensional display engine, the block data within the target field-of-view region based on the block data address; calling a target map interface through the native map module to parse the block data and generating map grid data based on the parsed data, the map grid data including road data and building data; and performing map rendering based on the road data and the building data in the map grid data.

Description

Map rendering method and apparatus, computer device, and storage medium
This application claims priority to Chinese Patent Application No. 201810971215.6, entitled "Map rendering method, apparatus, and computer device", filed with the China National Intellectual Property Administration on August 24, 2018, which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of network technologies, and in particular, to a map rendering method and apparatus, a computer device, and a storage medium.
Background
With the development of terminal technology, rendering of game interfaces has become a research focus in display technology. In mobile games, for example, map rendering is generally difficult: because the game's rendering approach conflicts with that of existing map components, those components cannot be used directly for rendering on the native side. Current game applications therefore typically render the map with the Unity engine, but performance limitations of the Unity engine itself cause runtime stuttering and wasted memory, which not only degrades the displayed picture but also interferes with the normal running of the game.
Summary
The embodiments of this application provide a map rendering method and apparatus, a computer device, and a storage medium.
The technical solutions are as follows:
In one aspect, a map rendering method is provided, including:
when a three-dimensional display engine on a terminal receives a grid update instruction, sending, through the three-dimensional display engine, a block request to a native map module of the terminal system, the block request carrying a block identifier corresponding to a target field-of-view region;
obtaining, by the terminal through the native map module, a block data address based on the block identifier, and sending the block data address to the three-dimensional display engine;
obtaining, by the terminal through the three-dimensional display engine, the block data within the target field-of-view region based on the block data address;
calling, by the terminal through the native map module, a target map interface to parse the block data, and generating map grid data based on the parsed data, the map grid data including road data and building data; and
performing, by the terminal, map rendering based on the road data and the building data in the map grid data.
In one aspect, a map rendering method is provided, including:
when a terminal receives a grid update instruction, sending a block request to a native map module of the terminal system, the block request carrying a block identifier corresponding to a target field-of-view region;
receiving, by the terminal, a block data address obtained by the native map module based on the block identifier;
obtaining, by the terminal, the block data within the target field-of-view region based on the block data address and sending the block data to the native map module, the native map module obtaining map grid data based on the block data, the map grid data including road data and building data; and
when the terminal receives a rendering instruction, performing map rendering based on the road data and the building data in the map grid data.
In one aspect, a map rendering method is provided, including:
receiving, by a terminal, a block request sent by a three-dimensional display engine, the block request carrying a block identifier corresponding to a target field-of-view region;
obtaining, by the terminal, a block data address based on the block identifier, and sending the block data address to the three-dimensional display engine;
receiving, by the terminal, block data obtained by the three-dimensional display engine based on the block data address;
calling, by the terminal, a target map interface to parse the block data, and generating map grid data based on the parsed data, the map grid data including road data and building data; and
when the terminal receives a rendering instruction, performing map rendering based on the road data and the building data in the map grid data.
In one aspect, a map rendering apparatus is provided, including:
a sending module, configured to send, when a grid update instruction is received, a block request to a native map module of a terminal system, the block request carrying a block identifier corresponding to a target field-of-view region;
a receiving module, configured to receive a block data address obtained by the native map module based on the block identifier;
the sending module being further configured to obtain the block data within the target field-of-view region based on the block data address and send the block data to the native map module, the native map module obtaining map grid data based on the block data, the map grid data including road data and building data; and
a rendering module, configured to perform, when a rendering instruction is received, map rendering based on the road data and the building data in the map grid data.
In one aspect, a map rendering apparatus is provided, including:
a receiving module, configured to receive a block request sent by a three-dimensional display engine, the block request carrying a block identifier corresponding to a target field-of-view region;
an obtaining module, configured to obtain a block data address based on the block identifier;
a sending module, configured to send the block data address to the three-dimensional display engine;
the receiving module being further configured to receive block data obtained by the three-dimensional display engine based on the block data address;
a calling module, configured to call a target map interface to parse the block data;
a generating module, configured to generate map grid data based on the parsed data, the map grid data including road data and building data; and
a rendering module, configured to perform, when a rendering instruction is received, map rendering based on the road data and the building data in the map grid data.
In one aspect, a computer device is provided, including a processor and a memory, the memory storing computer-readable instructions that, when executed by the processor, cause the processor to perform the method described in the foregoing embodiments.
In one aspect, one or more non-volatile storage media storing computer-readable instructions are provided, the computer-readable instructions, when executed by one or more processors, causing the one or more processors to perform the method described in the foregoing embodiments.
Details of one or more embodiments of this application are set forth in the accompanying drawings and the description below. Other features, objectives, and advantages of this application will become apparent from the specification, the drawings, and the claims.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of this application more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Apparently, the accompanying drawings in the following description show merely some embodiments of this application, and a person of ordinary skill in the art may derive other drawings from these drawings without creative efforts.
FIG. 1 is a diagram of an implementation environment of a map rendering method according to an embodiment of this application;
FIG. 2 is a diagram of a rendering effect according to an embodiment of this application;
FIG. 3 is a flowchart of a map rendering method according to an embodiment of this application;
FIG. 4 is an example diagram of a target field-of-view range according to an embodiment of this application;
FIG. 5 is an example diagram of a target field-of-view range according to an embodiment of this application;
FIG. 6 is an example diagram of a target field-of-view range according to an embodiment of this application;
FIG. 7 is a flowchart of a rendering process according to an embodiment of this application;
FIG. 8 is a schematic structural diagram of a map rendering apparatus according to an embodiment of this application;
FIG. 9 is a schematic structural diagram of a map rendering apparatus according to an embodiment of this application;
FIG. 10 is a schematic structural diagram of a map rendering apparatus according to an embodiment of this application;
FIG. 11 is a block diagram of a computer device 1100 according to an exemplary embodiment;
FIG. 12 is an internal structure diagram of a terminal according to an exemplary embodiment of this application.
Detailed Description
To make the objectives, technical solutions, and advantages of this application clearer, the following further describes this application in detail with reference to the accompanying drawings and the embodiments. It is to be understood that the specific embodiments described herein are merely used to explain this application and are not intended to limit it. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative efforts fall within the protection scope of this application. To facilitate understanding of the embodiments of this application, some terms are explained here:
Mesh data (Mesh): in rendering, a mesh represents a drawable entity. A piece of mesh data contains at least one set of vertex data, where each vertex may carry attributes such as coordinates and a normal vector; mesh data may also contain index data used to index the vertex data.
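As a rough illustration of this term only, a mesh can be modeled as a set of vertices plus index data that references them; the class and field names below are illustrative and not part of the patent:

```python
from dataclasses import dataclass, field

@dataclass
class Vertex:
    # Each vertex carries at least a position; a normal vector is optional.
    position: tuple            # (x, y, z) coordinates
    normal: tuple = None       # optional per-vertex normal vector

@dataclass
class Mesh:
    # One drawable entity: vertex data plus index data. Each consecutive
    # triple in `indices` selects three vertices forming one triangle.
    vertices: list = field(default_factory=list)
    indices: list = field(default_factory=list)

# A single quad built from four vertices and two indexed triangles:
quad = Mesh(
    vertices=[Vertex((0, 0, 0)), Vertex((1, 0, 0)),
              Vertex((1, 1, 0)), Vertex((0, 1, 0))],
    indices=[0, 1, 2, 0, 2, 3],
)
print(len(quad.vertices), len(quad.indices) // 3)  # 4 vertices, 2 triangles
```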
Shader: in rendering, a segment of instruction code applied to the graphics processing unit (GPU) that directs the GPU to render meshes and generate rendering results, for example by computing vertex transformations and lighting.
FIG. 1 is a diagram of an implementation environment of the map rendering method according to an embodiment of this application. Referring to FIG. 1, the terminal in this implementation environment includes a three-dimensional display engine, a native map module of the terminal system (which may, for example, take the form of a link library), a rendering module, and a target map interface. The three-dimensional display engine may be one that performs display based on application logic; for example, it may be the Unity game engine. As a cross-platform general-purpose game engine, Unity generally uses the C# language for game-logic development, so the three-dimensional display engine may be the C# script portion related to the game logic, which is independent of the terminal system. The native map module may be a module implemented in a computer language such as C++ and may be stored in the terminal system in the form of a link library for the three-dimensional display engine to call. The rendering module may be implemented by the GPU and can render frames according to the mesh data and other data bound by the three-dimensional display engine or the native map module, thereby producing the interface display on the terminal. The target map interface may be a map API used to access a target map server so as to use any map data service provided by that server.
It should be noted that a trigger relationship may exist between the three-dimensional display engine and the native map module; that is, when the three-dimensional display engine receives a certain instruction, it can, through preset processing logic, trigger the native map module to carry out specific steps, so that the rendering process is accomplished through the interworking of the two, achieving a rendering effect such as that shown in FIG. 2.
Based on the implementation environment provided in FIG. 1, FIG. 3 is a flowchart of a map rendering method according to an embodiment of this application. In this embodiment, the description takes as an example only the case where the native map module is implemented as a target link library. Referring to FIG. 3, the method includes:
301. The three-dimensional display engine on the terminal receives a grid update instruction.
The three-dimensional display engine may be provided by the game side and may obtain the relevant display data by reading a game configuration file, thereby displaying the game interface. The game configuration file may be stored on the terminal, or stored on a game server and read by the terminal through access to the game server.
In this embodiment of this application, the grid update instruction is an instruction to update the map grid. In some embodiments, the grid update instruction may be a frame update instruction, that is, an instruction to update the image frame to be displayed by the terminal. The grid update instruction may be triggered by the player's control of a virtual object or by real-world movement of the terminal on which the player is logged in.
It should be noted that the update may refer to loading the mesh data within the current field of view and unloading the mesh data beyond it. Some games have very large scenes; for example, some require the map to match the size of the real world. Because the player's field of view is limited, the map portions entering the field of view need to be loaded dynamically as the player moves across the map, while the portions leaving the field of view are unloaded. Therefore, in some games the map is divided at equal intervals into a two-dimensional grid, and as the field of view changes with the player's movement, the blocks of the grid are dynamically loaded and unloaded.
302. The three-dimensional display engine sends a block request to the target link library of the terminal's native system, the block request carrying a block identifier corresponding to a target field-of-view region.
In this embodiment of this application, when the three-dimensional display engine needs to update the currently displayed image frame, it may first obtain, according to the terminal's current positioning information, the block identifier corresponding to the target field-of-view region. The blocks corresponding to the target field-of-view region may be the blocks inside the region, that is, every block indicated by the determined block identifiers lies within the target field of view, as in the situation shown in FIG. 4; or each of the determined blocks lies at least partially within the target field of view, as in FIG. 5. The target field-of-view region may match the size and shape of the terminal's display area, or may be slightly larger than it, so that when the user subsequently makes a small positional movement the display can still be updated quickly, ensuring continuity of display. That is, the blocks corresponding to the target field-of-view region may also be the blocks inside the region plus the adjacent blocks within a certain surrounding area, for example within one block side length, as in FIG. 6.
For example, the target field-of-view region may be a circular region centered on the positioning information with a preset distance as its radius, so that the user can observe the surrounding geography when rotating the virtual object. The target field-of-view region may also be a rectangular region. Any method may be used to determine the target field-of-view region, and its shape may be determined by user requirements or a system default; this is not limited in the embodiments of this application.
For the determined target field-of-view region, the three-dimensional display engine may obtain, from the game configuration file and according to the real-world geographic location information corresponding to the region, the block identifiers of the blocks located within that geographic location information. In the game configuration file, block identifiers may be stored in correspondence with geographic location information, so that the corresponding block identifiers can be obtained accurately from the different determined geographic location information.
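A minimal sketch of how block identifiers for a rectangular field-of-view region might be enumerated on a map divided at equal intervals into a two-dimensional grid; the block size, identifier scheme, and function names here are assumptions for illustration, not specified by the patent:

```python
import math

def blocks_in_view(center_x, center_y, half_w, half_h, block_size):
    """Return the (col, row) identifiers of every grid block that overlaps
    the axis-aligned view rectangle centered on the player's position."""
    col_min = math.floor((center_x - half_w) / block_size)
    col_max = math.floor((center_x + half_w) / block_size)
    row_min = math.floor((center_y - half_h) / block_size)
    row_max = math.floor((center_y + half_h) / block_size)
    return [(c, r)
            for r in range(row_min, row_max + 1)
            for c in range(col_min, col_max + 1)]

# A 200x200 view centered at (150, 150) on a 100-unit grid overlaps 9 blocks.
ids = blocks_in_view(150, 150, 100, 100, 100)
print(len(ids))  # 9
```

As the player moves, the identifiers returned for the new view can be diffed against the previous set to decide which blocks to load and which to unload.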
303. The target link library obtains a block data address based on the block identifier.
The target link library may store a preset concatenation formula, which may include an address concatenation scheme serving the three-dimensional display engine. After obtaining the block identifier provided by the three-dimensional display engine, the target link library may concatenate the block identifier into an address according to the preset concatenation formula to obtain the block data address. Alternatively, step 303 may be implemented as follows: the target link library analyzes the block identifiers to obtain the block data address corresponding to each block identifier. Which of these methods is used is not limited in the embodiments of this application.
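The "preset concatenation formula" idea can be sketched as substituting the block identifier's fields into an address template; the URL template and parameter names below are hypothetical placeholders, not an address format disclosed by the patent:

```python
def block_data_address(block_id, zoom,
                       template="https://maps.example.com/tiles/{z}/{x}/{y}.dat"):
    # The link library concatenates the block identifier's fields into a
    # preset template, yielding the address the display engine downloads from.
    x, y = block_id
    return template.format(z=zoom, x=x, y=y)

addr = block_data_address((3, 5), zoom=14)
print(addr)  # https://maps.example.com/tiles/14/3/5.dat
```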
304. The target link library sends the block data address to the three-dimensional display engine.
305. The three-dimensional display engine obtains the block data within the target field-of-view region based on the block data address.
In this embodiment of this application, the three-dimensional display engine sends a download request carrying the block data address to a target server, and receives the block data within the target field-of-view region sent by the target server. The target server may be a server that provides map data services for the three-dimensional display engine.
306. The three-dimensional display engine sends the block data within the target field-of-view region to the target link library.
If the block data were loaded through the three-dimensional display engine, the engine would have to process a large amount of data centrally, incurring a large time cost. Therefore, in this embodiment of this application, this work is transferred to the target link library native to the terminal system, reducing the workload of the three-dimensional display engine and avoiding any impact on the normal running of the game application.
307. When the target link library receives the block data, it calls the target map interface to parse the block data.
The target map interface is associated with a target map service, so that the target link library can call the target map service through this interface to parse the block data. Since a target link library generally lacks the ability to parse block data itself, the parsing function can be provided by the target map service through this interface. In this way, the parsing of block data is achieved and the native target link library can take part in the rendering process of the three-dimensional display engine, greatly reducing the processing pressure that the whole rendering process places on the engine.
308. The target link library performs simplification processing on the road data and the building data in the parsed data to obtain the map grid data.
The data obtained through the foregoing parsing includes road data and building data. Roads may involve complex situations such as bends and intersections, and road sizes differ, so the road data may be simplified. The simplification processing may specifically include at least one of the following:
(1) Scaling the road sizes in the road data to a preset size. The preset size may match the terminal's display area, or may be a system preset size, so that the displayed road size matches the size of the terminal's display area. The preset size may also be determined according to the user's zoom operation on the terminal or a configured zoom ratio, which is not limited in the embodiments of this application.
(2) Simplifying the road intersections in the road data. The embodiments of this application do not limit how this simplification is performed; its purpose is to reduce the line density in the road data and improve display clarity.
(3) Performing dimensionality reduction processing on the road vertex coordinates in the road data to obtain road vertex coordinates represented by two-dimensional vectors.
In this embodiment of this application, the data parsed through the target map interface includes road data containing road vertex coordinates and index data, where a road vertex coordinate is a three-dimensional vector representing the coordinates of the vertex in the x, y, and z dimensions. However, the display in this embodiment need not include elevation; the roads only need to appear laid flat on the ground. Therefore, the road vertex coordinates can be reduced in dimensionality, with the processed coordinates represented by two-dimensional vectors, that is, having only the x and y dimensions.
Further, since the game-effect display also requires 3 floating-point parameters, a vertex can be represented, based on the dimensionally reduced road vertex coordinates plus the 3 floating-point parameters, by a five-dimensional vector rather than the six-dimensional form currently used, which reduces the amount of processing.
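The dimensionality reduction above can be sketched as dropping the z component and appending the three floating-point display parameters, turning the six-component (x, y, z) + 3-parameter form into a five-component vector; the function and parameter names are illustrative:

```python
def reduce_road_vertex(vertex3d, params):
    """Drop the z coordinate of a road vertex and append the three
    floating-point display parameters, yielding a 5-component vector
    instead of the 6-component (x, y, z) + 3-parameter form."""
    assert len(vertex3d) == 3 and len(params) == 3
    x, y, _z = vertex3d  # elevation is unused: roads are laid flat on the ground
    return (x, y, *params)

v = reduce_road_vertex((12.0, 34.0, 5.0), (0.1, 0.2, 0.3))
print(v)       # (12.0, 34.0, 0.1, 0.2, 0.3)
print(len(v))  # 5
```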
It should be noted that the road data may include data such as lakes, grassland, and roads, which is not specifically limited in the embodiments of this application.
As for the road material, it includes the shader used for road rendering and the parameters the shader requires; in this embodiment of this application it is managed by the target link library of the terminal's native system. Because the vertices have been dimensionally reduced, the three-dimensional display engine can no longer support the reduced data: in general, a three-dimensional display engine supports only three- or four-dimensional vectors as coordinate parameters, whereas the vertex coordinates in this embodiment have been reduced to two-dimensional vectors. Therefore, a shader implemented on the native side must be used, and the shader parameters must likewise be passed from the native side to the rendering side.
As for the building data, it can be simplified into building outline data and height data, so that the final display need not show the full appearance of a building but only a brief outline, which improves rendering efficiency and greatly reduces the amount of data processed in the whole pipeline.
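A sketch of this building simplification: a model is reduced to its footprint outline plus a single height, from which a simple extruded prism can later be rebuilt for display. The dictionary representation and function names are assumptions for illustration:

```python
def simplify_building(footprint, height):
    # Keep only the 2D outline vertices and one height value.
    return {"outline": list(footprint), "height": height}

def prism_vertices(building):
    """Rebuild a minimal renderable shape: the outline at ground level
    plus the same outline lifted to the building height."""
    h = building["height"]
    base = [(x, y, 0.0) for x, y in building["outline"]]
    top = [(x, y, h) for x, y in building["outline"]]
    return base + top

b = simplify_building([(0, 0), (10, 0), (10, 10), (0, 10)], height=30.0)
print(len(prism_vertices(b)))  # 8 vertices: 4 at the base, 4 at the top
```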
309. Map rendering is performed based on the road data and the building data in the map grid data.
The rendering process may be executed by an OpenGL device on the terminal, which renders according to the road data and building data bound by the three-dimensional display engine and/or the target link library. For the specific rendering procedure, see the following embodiment and the interaction process shown in FIG. 7.
The method provided in this embodiment of this application uses data interaction between the target link library and the three-dimensional display engine so that the system-native target link library can replace the engine in processing the map grid data. The engine no longer has to process a large amount of data, so no excessive time cost is incurred, runtime stuttering and memory waste are avoided, the display quality is greatly improved, and the normal running of the application is not affected. In particular, for functional components that already implement map rendering on the native side, the method of this embodiment plugs the native-side rendering function into the three-dimensional display engine's own rendering pipeline on top of that function, achieving native-layer map rendering with a stutter-free experience.
In the rendering procedure of step 309, to make the rendered scene look better and avoid overlap or other display artifacts, rendering may follow this principle: during rendering, objects whose transparency is greater than a target transparency are rendered first, and objects whose transparency is less than the target transparency are rendered afterwards. For example, all opaque objects should be rendered before all transparent objects: roads are opaque objects, so they come early in the render order, while buildings are semi-transparent objects and are rendered together with the other semi-transparent objects.
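This ordering rule can be sketched as a partition of the scene's draw list: fully opaque objects (roads, lakes) are drawn before semi-transparent ones (buildings). The object representation and alpha threshold here are illustrative assumptions:

```python
def render_order(objects, opaque_alpha=1.0):
    """Return the draw order: objects at least as opaque as the threshold
    first, then the more transparent ones (e.g. semi-transparent buildings)."""
    opaque = [o for o in objects if o["alpha"] >= opaque_alpha]
    transparent = [o for o in objects if o["alpha"] < opaque_alpha]
    return opaque + transparent

scene = [{"name": "building", "alpha": 0.5},
         {"name": "road", "alpha": 1.0},
         {"name": "lake", "alpha": 1.0}]
print([o["name"] for o in render_order(scene)])  # ['road', 'lake', 'building']
```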
To merge map rendering into the rendering pipeline of the three-dimensional display engine, a target script is added to the scene's main camera. When the script runs and the camera's OnPreRender event fires, the terminal can render the lakes, grassland, and roads (roads being opaque objects) by calling the native-side road rendering interface; when the camera's OnPostRender event fires, the terminal can render the buildings on the map by calling the native-side building rendering interface. The specific procedure is shown in FIG. 7:
1. The three-dimensional display engine clears the background. The engine (Unity) clears the screen with the background color and also clears the depth buffer and the stencil buffer. This clearing removes previously loaded rendering data, such as the graphics-rendering context information.
2. The engine triggers the OnPreRender event function of the target script on the camera.
3. The camera's target script triggers the road rendering function of the game-side map module.
4. The target link library is triggered to render roads.
5. The target link library sets the road material data into the OpenGL context. The material data includes the shader and the parameters the shader requires.
6. The target link library traverses all blocks and, for each block currently within the camera's field of view, binds the road mesh into the OpenGL context.
7. After binding, the target link library notifies the OpenGL context to draw.
8. The engine renders the opaque objects in the scene (other than roads).
9. The engine renders the transparent objects in the scene (other than buildings).
10. The engine triggers the OnPostRender event function of the target script on the camera.
11. The target script on the camera triggers the engine's building rendering function.
12. The engine sets the building material into the OpenGL context.
13. The engine triggers rendering of the buildings.
14. The target link library traverses all blocks and, for each block currently within the camera's field of view, binds the mesh data in the building data into the OpenGL context.
15. After binding, the target link library notifies the OpenGL context to draw.
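Steps 1-15 above can be compressed into a sketch of the per-frame call order, with the native link library hooked in through the camera's OnPreRender/OnPostRender events. The class and method names are illustrative, and the real Unity/OpenGL calls are replaced by log entries so the ordering is visible:

```python
class FrameTrace:
    def __init__(self):
        self.log = []

    # --- native link library side ---
    def native_draw_roads(self):
        self.log += ["set road material", "bind road meshes in view", "draw roads"]

    def native_draw_buildings(self):
        self.log += ["bind building meshes in view", "draw buildings"]

    # --- engine side, one frame ---
    def render_frame(self):
        self.log.append("clear color/depth/stencil")      # step 1
        self.native_draw_roads()                          # steps 2-7, via OnPreRender
        self.log.append("render other opaque objects")    # step 8
        self.log.append("render other transparent objects")  # step 9
        self.log.append("set building material")          # steps 10-12, via OnPostRender
        self.native_draw_buildings()                      # steps 13-15

trace = FrameTrace()
trace.render_frame()
print(trace.log[0], "->", trace.log[-1])
```

The key property the sketch preserves is the ordering: roads are drawn before the engine's other opaque objects, and buildings are drawn last among the semi-transparent objects.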
In the above process, the target link library binds the mesh data in the road data into the graphics-rendering context information and sets the road material data in that context information; for the building data, the target link library binds the building mesh data into the graphics-rendering context information, while the three-dimensional display engine sets the building material data in the context information. This makes it possible to render a three-dimensional map inside the three-dimensional display engine, replacing the map rendering of the native SDK and thereby enabling more complex rendering effects, while the grid update is performed on the system's native side, achieving higher performance and lower memory usage.
Further, in this embodiment of this application the rendering function is also upgraded, so that the target link library is used to bind the road materials when rendering roads, while the material system of the three-dimensional display engine is used for rendering when rendering buildings. The target link library's processing of map data is thus successfully embedded into the rendering of the three-dimensional display engine, making complex rendering effects easier to achieve.
It is to be understood that the steps in the embodiments of this application are not necessarily performed sequentially in the order indicated by the step numbers. Unless explicitly specified herein, the execution order of these steps is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in each embodiment may include multiple sub-steps or stages. These sub-steps or stages are not necessarily completed at the same moment but may be executed at different moments, and their execution order is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, a terminal is further provided. The terminal includes a map rendering apparatus, the map rendering apparatus includes various modules, and each module may be implemented in whole or in part by software, hardware, or a combination thereof.
FIG. 8 is a structural diagram of a map rendering apparatus according to an embodiment of this application. Referring to FIG. 8, the apparatus includes:
a first sending module 801, configured to send, when a grid update instruction is received, a block request to a native map module of a terminal system through a three-dimensional display engine, the block request carrying a block identifier corresponding to a target field-of-view region;
an address obtaining module 802, configured to obtain a block data address based on the block identifier through the native map module;
a second sending module 803, configured to send the block data address to the three-dimensional display engine;
a block data obtaining module 804, configured to obtain the block data within the target field-of-view region based on the block data address through the three-dimensional display engine;
a parsing module 805, configured to call a target map interface through the native map module to parse the block data and generate map grid data based on the parsed data, the map grid data including road data and building data; and
a rendering module 806, configured to perform map rendering based on the road data and the building data in the map grid data.
In some embodiments, the address obtaining module is configured to concatenate, through the native map module, the block identifier into an address according to a preset concatenation formula to obtain the block data address.
In some embodiments, the block data obtaining module is configured to send, through the three-dimensional display engine, a download request carrying the block data address to a target server, and receive the block data within the target field-of-view region sent by the target server.
In some embodiments, the parsing module is configured to perform, through the native map module, simplification processing on the road data and the building data in the parsed data to obtain the map grid data.
In some embodiments, the parsing module is configured to perform, through the native map module, dimensionality reduction processing on the road vertex coordinates in the road grid data to obtain road vertex coordinates represented by two-dimensional vectors.
In some embodiments, the rendering module is configured to render, during the rendering process, objects whose transparency is greater than a target transparency first, and objects whose transparency is less than the target transparency afterwards.
In some embodiments, the rendering module is configured to bind, through the native map module, the mesh data and the material data in the road data to the context information of graphics rendering; bind, through the native map module, the mesh data in the building data to the context information of the graphics rendering; and bind, through the three-dimensional display engine, the material data in the building data to the context information of the graphics rendering.
FIG. 9 is a structural diagram of a map rendering apparatus according to an embodiment of this application. Referring to FIG. 9, the apparatus includes:
a sending module 901, configured to send, when a grid update instruction is received, a block request to a native map module of a terminal system, the block request carrying a block identifier corresponding to a target field-of-view region;
a receiving module 902, configured to receive a block data address obtained by the native map module based on the block identifier;
the sending module 901 being further configured to obtain the block data within the target field-of-view region based on the block data address and send the block data to the native map module, the native map module obtaining map grid data based on the block data, the map grid data including road data and building data; and
a rendering module 903, configured to perform, when a rendering instruction is received, map rendering based on the road data and the building data in the map grid data.
In some embodiments, the rendering module 903 is configured to set, through the three-dimensional display engine, the material data in the building data in the context information of the graphics rendering, and a graphics rendering interface performs map rendering based on the context information of the graphics rendering.
FIG. 10 is a structural diagram of a map rendering apparatus according to an embodiment of this application. Referring to FIG. 10, the apparatus includes:
a receiving module 1001, configured to receive a block request sent by a three-dimensional display engine, the block request carrying a block identifier corresponding to a target field-of-view region;
an obtaining module 1002, configured to obtain a block data address based on the block identifier;
a sending module 1003, configured to send the block data address to the three-dimensional display engine;
the receiving module 1001 being further configured to receive block data obtained by the three-dimensional display engine based on the block data address;
a calling module 1004, configured to call a target map interface to parse the block data;
a generating module 1005, configured to generate map grid data based on the parsed data, the map grid data including road data and building data; and
a rendering module 1006, configured to perform, when a rendering instruction is received, map rendering based on the road data and the building data in the map grid data.
In some embodiments, the rendering module 1006 is configured to set, through the native map module, the mesh data in the road data in the context information of a graphics rendering interface and bind the material data in the road data to the context information of the graphics rendering interface; and bind, through the native map module, the mesh data in the building data to the context information of the graphics rendering, a graphics rendering interface performing map rendering based on the context information of the graphics rendering.
It should be noted that when the map rendering apparatus provided in the foregoing embodiments performs map rendering, the division into the foregoing function modules is merely used as an example. In practical applications, the foregoing functions may be allocated to different function modules as required; that is, the internal structure of the device is divided into different function modules to complete all or part of the functions described above. In addition, the map rendering apparatus provided in the foregoing embodiments and the map rendering method embodiments belong to the same concept; for the specific implementation process, refer to the method embodiments, and details are not repeated here.
FIG. 11 is a block diagram of a computer device 1100 according to an exemplary embodiment. Referring to FIG. 11, the computer device 1100 includes a processing component 1122, which further includes one or more processors, and memory resources represented by a memory 1132 for storing instructions executable by the processing component 1122, for example an application program. The application program stored in the memory 1132 may include one or more modules, each corresponding to a set of instructions. In addition, the processing component 1122 is configured to execute the instructions to perform the foregoing map rendering method.
The computer device 1100 may further include a power supply component 1126 configured to perform power management of the computer device 1100, a wired or wireless network interface 1180 configured to connect the computer device 1100 to a network, and an input/output (I/O) interface 1158. The computer device 1100 may operate based on an operating system stored in the memory 1132, for example, Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
FIG. 12 shows an internal structure diagram of a terminal in one embodiment. As shown in FIG. 12, the terminal includes a processor, a memory, a network interface, and an input device connected through a system bus. The memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium of the terminal stores an operating system and may also store computer-readable instructions which, when executed by the processor, cause the processor to implement the map rendering method. The internal memory may also store computer-readable instructions which, when executed by the processor, cause the processor to perform the map rendering method. The input device may be a touch layer covering a display screen, or a button, trackball, or touchpad provided on the computer device housing, or an external keyboard, touchpad, or mouse.
A person skilled in the art can understand that the structure shown in FIG. 12 is only a block diagram of a part of the structure related to the solution of this application and does not constitute a limitation on the terminal to which the solution of this application is applied. A specific terminal may include more or fewer components than shown in the figure, or combine certain components, or have a different component arrangement.
In one embodiment, the map rendering apparatus provided in this application may be implemented in the form of computer-readable instructions, and the computer-readable instructions may run on a terminal as shown in FIG. 12. The memory of the terminal may store the program modules constituting the map rendering apparatus, for example, the first sending module 801, the address obtaining module 802, the second sending module 803, the block data obtaining module 804, the parsing module 805, and the rendering module 806. The computer-readable instructions constituted by these program modules cause the processor to perform the steps in the map rendering method of the embodiments of this application described in this specification.
An embodiment of this application provides a computer-readable storage medium, the storage medium storing computer-readable instructions that are loaded by a processor to perform the operations in the map rendering method of the foregoing embodiments.
A person of ordinary skill in the art can understand that all or some of the procedures of the methods of the foregoing embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the procedures of the foregoing method embodiments. Any reference to memory, storage, a database, or other media used in the embodiments provided in this application may include non-volatile and/or volatile memory. The non-volatile memory may include a read-only memory (ROM), a programmable ROM (PROM), an electrically programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), or a flash memory. The volatile memory may include a random access memory (RAM) or an external cache. By way of illustration and not limitation, RAM is available in various forms, such as a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchlink DRAM (SLDRAM), a Rambus direct RAM (RDRAM), a direct Rambus dynamic RAM (DRDRAM), and a Rambus dynamic RAM (RDRAM).
After considering the specification and practicing the application disclosed herein, a person skilled in the art will easily conceive of other implementations of this application. This application is intended to cover any variations, uses, or adaptations of this application that follow its general principles and include common knowledge or customary technical means in the art that are not disclosed in this application. The specification and the embodiments are to be considered as exemplary only, with the true scope and spirit of this application being indicated by the following claims.
It is to be understood that this application is not limited to the precise structures described above and shown in the accompanying drawings, and various modifications and changes can be made without departing from its scope. The scope of this application is limited only by the appended claims.

Claims (15)

  1. A map rendering method, comprising:
    when a three-dimensional display engine on a terminal receives a grid update instruction, sending, through the three-dimensional display engine, a block request to a native map module of the terminal, the block request carrying a block identifier corresponding to a target field-of-view region;
    obtaining, by the terminal through the native map module, a block data address based on the block identifier, and sending the block data address to the three-dimensional display engine;
    obtaining, by the terminal through the three-dimensional display engine, block data within the target field-of-view region based on the block data address;
    calling, by the terminal through the native map module, a target map interface to parse the block data, and generating map grid data based on the parsed data, the map grid data comprising road data and building data; and
    performing, by the terminal, map rendering based on the road data and the building data in the map grid data.
  2. The method according to claim 1, wherein the obtaining, by the terminal through the native map module, a block data address based on the block identifier comprises:
    concatenating, by the terminal through the native map module, the block identifier into an address according to a preset concatenation formula to obtain the block data address.
  3. The method according to claim 1, wherein the obtaining, by the terminal through the three-dimensional display engine, block data within the target field-of-view region based on the block data address comprises:
    sending, by the terminal through the three-dimensional display engine, a download request to a target server, the download request carrying the block data address; and
    receiving, by the terminal, the block data within the target field-of-view region sent by the target server.
  4. The method according to claim 1, wherein the generating map grid data based on the parsed data comprises:
    performing, by the terminal through the native map module, simplification processing on the road data and the building data in the parsed data to obtain the map grid data.
  5. The method according to claim 4, wherein the performing simplification processing on the road data and the building data in the parsed data to obtain map grid data comprises:
    performing, by the terminal through the native map module, dimensionality reduction processing on road vertex coordinates in the road data to obtain road vertex coordinates represented by two-dimensional vectors.
  6. The method according to claim 1, wherein the performing, by the terminal, map rendering based on the road data and the building data in the map grid data comprises:
    rendering, by the terminal during the rendering process, objects whose transparency is greater than a target transparency first, and rendering objects whose transparency is less than the target transparency afterwards.
  7. The method according to claim 6, wherein the rendering, by the terminal during the rendering process, objects whose transparency is greater than a target transparency first and rendering objects whose transparency is less than the target transparency afterwards comprises:
    binding, by the terminal through the native map module, the mesh data in the road data to context information of graphics rendering, and setting the material data in the road data in the context information of the graphics rendering; and
    binding, by the terminal through the native map module, the mesh data in the building data to the context information of the graphics rendering, and setting, through the three-dimensional display engine, the material data in the building data in the context information of the graphics rendering.
  8. A map rendering method, comprising:
    when a terminal receives a grid update instruction, sending a block request to a native map module of a terminal system, the block request carrying a block identifier corresponding to a target field-of-view region;
    receiving, by the terminal, a block data address obtained by the native map module based on the block identifier;
    obtaining, by the terminal, block data within the target field-of-view region based on the block data address, and sending the block data to the native map module, the native map module obtaining map grid data based on the block data, the map grid data comprising road data and building data; and
    when the terminal receives a rendering instruction, performing map rendering based on the road data and the building data in the map grid data.
  9. The method according to claim 8, wherein when the terminal receives a rendering instruction, the performing map rendering based on the road data and the building data in the map grid data comprises:
    setting, by the terminal through the three-dimensional display engine, the material data in the building data in context information of graphics rendering, a graphics rendering interface performing map rendering based on the context information of the graphics rendering.
  10. A map rendering method, comprising:
    receiving, by a terminal, a block request sent by a three-dimensional display engine, the block request carrying a block identifier corresponding to a target field-of-view region;
    obtaining, by the terminal, a block data address based on the block identifier, and sending the block data address to the three-dimensional display engine;
    receiving, by the terminal, block data obtained by the three-dimensional display engine based on the block data address;
    calling, by the terminal, a target map interface to parse the block data, and generating map grid data based on the parsed data, the map grid data comprising road data and building data; and
    when the terminal receives a rendering instruction, performing map rendering based on the road data and the building data in the map grid data.
  11. The method according to claim 10, wherein when the terminal receives a rendering instruction, the performing map rendering based on the road data and the building data in the map grid data comprises:
    setting, by the terminal through the native map module, the mesh data in the road data in context information of a graphics rendering interface, and binding the material data in the road data to the context information of the graphics rendering interface; and
    binding, by the terminal through the native map module, the mesh data in the building data to the context information of the graphics rendering, the graphics rendering interface performing map rendering based on the context information of the graphics rendering.
  12. A map rendering apparatus, comprising:
    a sending module, configured to send, when a grid update instruction is received, a block request to a native map module of a terminal system, the block request carrying a block identifier corresponding to a target field-of-view region;
    a receiving module, configured to receive a block data address obtained by the native map module based on the block identifier;
    the sending module being further configured to obtain block data within the target field-of-view region based on the block data address and send the block data to the native map module, the native map module obtaining map grid data based on the block data, the map grid data comprising road data and building data; and
    a rendering module, configured to perform, when a rendering instruction is received, map rendering based on the road data and the building data in the map grid data.
  13. A map rendering apparatus, comprising:
    a receiving module, configured to receive a block request sent by a three-dimensional display engine, the block request carrying a block identifier corresponding to a target field-of-view region;
    an obtaining module, configured to obtain a block data address based on the block identifier;
    a sending module, configured to send the block data address to the three-dimensional display engine;
    the receiving module being further configured to receive block data obtained by the three-dimensional display engine based on the block data address;
    a calling module, configured to call a target map interface to parse the block data;
    a generating module, configured to generate map grid data based on the parsed data, the map grid data comprising road data and building data; and
    a rendering module, configured to perform, when a rendering instruction is received, map rendering based on the road data and the building data in the map grid data.
  14. A computer device, comprising a processor and a memory, the memory storing computer-readable instructions that, when executed by the processor, cause the processor to perform the method according to any one of claims 1 to 11.
  15. One or more non-volatile storage media storing computer-readable instructions that, when executed by one or more processors, cause the one or more processors to perform the method according to any one of claims 1 to 11.
PCT/CN2019/102035 2018-08-24 2019-08-22 地图渲染方法、装置、计算机设备及存储介质 WO2020038441A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2020550861A JP7085012B2 (ja) 2018-08-24 2019-08-22 マップレンダリング方法、装置、コンピュータ装置及びコンピュータプログラム
EP19852334.2A EP3753614B1 (en) 2018-08-24 2019-08-22 Map rendering method and apparatus, computer device and storage medium
US17/017,520 US11852499B2 (en) 2018-08-24 2020-09-10 Map rendering method and apparatus, computer device, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810971215.6 2018-08-24
CN201810971215.6A CN109260708B (zh) 2018-08-24 2018-08-24 地图渲染方法、装置以及计算机设备

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/017,520 Continuation US11852499B2 (en) 2018-08-24 2020-09-10 Map rendering method and apparatus, computer device, and storage medium

Publications (1)

Publication Number Publication Date
WO2020038441A1 true WO2020038441A1 (zh) 2020-02-27

Family

ID=65154283

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/102035 WO2020038441A1 (zh) 2018-08-24 2019-08-22 地图渲染方法、装置、计算机设备及存储介质

Country Status (5)

Country Link
US (1) US11852499B2 (zh)
EP (1) EP3753614B1 (zh)
JP (1) JP7085012B2 (zh)
CN (1) CN109260708B (zh)
WO (1) WO2020038441A1 (zh)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111680347A (zh) * 2020-05-22 2020-09-18 重庆新创科技股份有限公司 一种道路标线设计方法、装置、计算机设备及存储介质
CN111773709A (zh) * 2020-08-14 2020-10-16 网易(杭州)网络有限公司 场景地图的生成方法及装置、计算机存储介质、电子设备
CN111784796A (zh) * 2020-06-22 2020-10-16 上海米哈游天命科技有限公司 一种地形网格生成方法、装置、设备和介质
CN112386911A (zh) * 2020-12-08 2021-02-23 网易(杭州)网络有限公司 导航网格生成方法、装置、非易失性存储介质及电子装置
CN112686948A (zh) * 2020-12-25 2021-04-20 北京像素软件科技股份有限公司 编辑器操作方法、装置和电子设备
CN114328769A (zh) * 2020-09-30 2022-04-12 中科星图股份有限公司 基于WebGL的北斗网格绘制方法及装置
CN115292434A (zh) * 2022-09-29 2022-11-04 四川省交通勘察设计研究院有限公司 一种基于地图引擎的gis路线可视化交互方法

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108072375B (zh) * 2016-11-09 2020-01-10 腾讯科技(深圳)有限公司 一种导航中的信息识别方法及终端
CN109260708B (zh) * 2018-08-24 2020-01-10 腾讯科技(深圳)有限公司 地图渲染方法、装置以及计算机设备
CN110032614B (zh) * 2019-04-18 2020-02-07 成都四方伟业软件股份有限公司 基于wasm的地图矢量渲染方法和装置
CN113498474B (zh) * 2020-01-21 2024-07-19 深圳元戎启行科技有限公司 高精度地图更新方法、装置、计算机设备和存储介质
CN111617467A (zh) * 2020-04-30 2020-09-04 珠海网易达电子科技发展有限公司 用于生成地图的方法、装置、电子设备和计算机存储介质
CN112807685A (zh) * 2021-01-22 2021-05-18 珠海天燕科技有限公司 基于游戏角色轨迹的草地渲染方法、装置及其设备
CN112988932B (zh) * 2021-03-10 2024-06-11 阿波罗智联(北京)科技有限公司 高精地图标注方法、装置、设备、可读存储介质及产品
CN113144614B (zh) * 2021-05-21 2024-08-16 苏州仙峰网络科技股份有限公司 基于Tiled Map的纹理采样贴图计算方法及装置
CN113689515B (zh) * 2021-07-21 2024-06-25 华东计算技术研究所(中国电子科技集团公司第三十二研究所) 地图渲染系统、方法及介质
CN114627221B (zh) * 2021-12-08 2023-11-10 北京蓝亚盒子科技有限公司 一种场景渲染方法、装置及运行器、可读存储介质
CN114398328A (zh) * 2022-01-17 2022-04-26 杭州电魂网络科技股份有限公司 一种数据处理方法、装置、电子设备以及存储介质
CN114225385B (zh) * 2022-02-25 2022-07-08 腾讯科技(深圳)有限公司 云游戏数据处理方法、装置、设备及存储介质
CN114615241B (zh) * 2022-03-03 2024-07-26 智道网联科技(北京)有限公司 基于高精地图的动态路网显示方法及相关设备
CN115690346B (zh) * 2022-11-14 2023-08-08 北京世冠金洋科技发展有限公司 一种三维地形的生成方法及装置
CN118097041A (zh) * 2022-11-18 2024-05-28 腾讯科技(深圳)有限公司 地图显示方法、装置、设备、存储介质及产品
KR102710974B1 (ko) * 2023-05-26 2024-09-30 주식회사 딥파인 3차원 맵 거래 중개 서비스 제공 방법 및 시스템

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103187003A (zh) * 2011-12-31 2013-07-03 北京图盟科技有限公司 一种电子地图的访问方法、设备和系统
US20130260846A1 (en) * 2012-03-29 2013-10-03 Empire Technology Development Llc Enabling location-based applications to work with imaginary locations
CN104958900A (zh) * 2015-06-26 2015-10-07 乐道互动(天津)科技有限公司 用于开发2d场景和3d角色的游戏引擎系统及调用方法
CN107423445A (zh) * 2017-08-10 2017-12-01 腾讯科技(深圳)有限公司 一种地图数据处理方法、装置及存储介质
CN108022285A (zh) * 2017-11-30 2018-05-11 杭州电魂网络科技股份有限公司 地图渲染方法及装置
CN109260708A (zh) * 2018-08-24 2019-01-25 腾讯科技(深圳)有限公司 地图渲染方法、装置以及计算机设备

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08262975A (ja) * 1995-03-28 1996-10-11 Pioneer Electron Corp 車載ナビゲーション装置
JP2001075967A (ja) 1999-08-31 2001-03-23 Denso Corp 地図データの更新用情報作成方法及び地図データの差分更新システム
JP3568159B2 (ja) 2001-03-15 2004-09-22 松下電器産業株式会社 三次元地図オブジェクト表示装置および方法、およびその方法を用いたナビゲーション装置
JP3582509B2 (ja) 2001-10-05 2004-10-27 朝日航洋株式会社 三次元地図データ処理方法、装置及びプログラム
JP2006284704A (ja) 2005-03-31 2006-10-19 Kitakyushu Foundation For The Advancement Of Industry Science & Technology 立体地図簡略化装置及び立体地図簡略化方法
JP2007133489A (ja) 2005-11-08 2007-05-31 Sony Corp 仮想空間画像表示方法、装置、仮想空間画像表示プログラム及び記録媒体
KR100790892B1 (ko) * 2006-10-18 2008-01-02 삼성전자주식회사 투명 객체의 화질 향상을 위한 3차원 그래픽스 데이터렌더링 방법 및 장치
JP2011197064A (ja) 2010-03-17 2011-10-06 Mitsubishi Electric Corp 3次元地図表示装置
US8655106B2 (en) * 2011-10-24 2014-02-18 Fannie Mae Automated valuation model with customizable neighborhood determination
US8803920B2 (en) 2011-12-12 2014-08-12 Google Inc. Pre-fetching map tile data along a route
KR101953133B1 (ko) 2012-02-27 2019-05-22 삼성전자주식회사 렌더링 장치 및 그 방법
US10109255B2 (en) * 2012-06-05 2018-10-23 Apple Inc. Method, system and apparatus for dynamically generating map textures
CN103915033A (zh) * 2012-12-29 2014-07-09 AutoNavi Software Co., Ltd. Map rendering method and apparatus, and mobile terminal
CN106296813B (zh) * 2015-05-14 2018-11-13 Shanghai Municipal Institute of Surveying and Mapping Three-dimensional static map production method
CN104966315B (zh) * 2015-05-18 2018-02-27 Shenzhen Tencent Computer Systems Co., Ltd. Three-dimensional model processing method and apparatus
CN104835202A (zh) * 2015-05-20 2015-08-12 Academy of Armored Forces Engineering of the PLA Rapid construction method for three-dimensional virtual scenes
CN105447101B (zh) * 2015-11-12 2020-01-07 Beijing Ruian Technology Co., Ltd. Map engine implementation method and apparatus
CN105894563B (zh) * 2016-04-25 2018-09-18 The 28th Research Institute of China Electronics Technology Group Corporation Global ocean effect simulation method on a digital earth
CN107103072A (zh) * 2017-04-21 2017-08-29 Beijing Shiju Technology Co., Ltd. Method and system for displaying spatial graphics data on a mobile terminal
CN107890671B (zh) * 2017-12-05 2020-10-30 Tencent Technology (Shenzhen) Co., Ltd. Three-dimensional model rendering method and apparatus for the web, computer device, and storage medium
CN108109204B (zh) * 2017-12-18 2021-06-25 Suzhou Snail Digital Technology Co., Ltd. Method and system for creating and rendering large-scale terrain

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103187003A (zh) * 2011-12-31 2013-07-03 Beijing Tumeng Technology Co., Ltd. Electronic map access method, device, and system
US20130260846A1 (en) * 2012-03-29 2013-10-03 Empire Technology Development Llc Enabling location-based applications to work with imaginary locations
CN104958900A (zh) * 2015-06-26 2015-10-07 Ledao Interactive (Tianjin) Technology Co., Ltd. Game engine system for developing 2D scenes and 3D characters, and invocation method
CN107423445A (zh) * 2017-08-10 2017-12-01 Tencent Technology (Shenzhen) Co., Ltd. Map data processing method, apparatus, and storage medium
CN108022285A (zh) * 2017-11-30 2018-05-11 Hangzhou Electric Soul Network Technology Co., Ltd. Map rendering method and apparatus
CN109260708A (zh) * 2018-08-24 2019-01-25 Tencent Technology (Shenzhen) Co., Ltd. Map rendering method, apparatus, and computer device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3753614A4

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111680347A (zh) * 2020-05-22 2020-09-18 Chongqing Xinchuang Technology Co., Ltd. Road marking design method, apparatus, computer device, and storage medium
CN111680347B (zh) * 2020-05-22 2023-10-27 Chongqing Xinchuang Technology Co., Ltd. Road marking design method, apparatus, computer device, and storage medium
CN111784796A (zh) * 2020-06-22 2020-10-16 Shanghai miHoYo Tianming Technology Co., Ltd. Terrain mesh generation method, apparatus, device, and medium
CN111773709A (zh) * 2020-08-14 2020-10-16 NetEase (Hangzhou) Network Co., Ltd. Scene map generation method and apparatus, computer storage medium, and electronic device
CN111773709B (zh) * 2020-08-14 2024-02-02 NetEase (Hangzhou) Network Co., Ltd. Scene map generation method and apparatus, computer storage medium, and electronic device
CN114328769A (zh) * 2020-09-30 2022-04-12 Zhongke Xingtu Co., Ltd. WebGL-based BeiDou grid drawing method and apparatus
CN112386911A (zh) * 2020-12-08 2021-02-23 NetEase (Hangzhou) Network Co., Ltd. Navigation mesh generation method and apparatus, non-volatile storage medium, and electronic apparatus
CN112686948A (zh) * 2020-12-25 2021-04-20 Beijing Pixel Software Technology Co., Ltd. Editor operation method, apparatus, and electronic device
CN115292434A (zh) * 2022-09-29 2022-11-04 Sichuan Communications Surveying & Design Institute Co., Ltd. GIS route visualization and interaction method based on a map engine
CN115292434B (zh) * 2022-09-29 2022-12-13 Sichuan Communications Surveying & Design Institute Co., Ltd. GIS route visualization and interaction method based on a map engine

Also Published As

Publication number Publication date
CN109260708A (zh) 2019-01-25
EP3753614A4 (en) 2021-05-12
US11852499B2 (en) 2023-12-26
US20200408558A1 (en) 2020-12-31
JP7085012B2 (ja) 2022-06-15
EP3753614A1 (en) 2020-12-23
JP2021516820A (ja) 2021-07-08
CN109260708B (zh) 2020-01-10
EP3753614B1 (en) 2023-06-07

Similar Documents

Publication Publication Date Title
WO2020038441A1 (zh) Map rendering method and apparatus, computer device, and storage medium
CN105741228B (zh) Graphics processing method and apparatus
US11908039B2 (en) Graphics rendering method and apparatus, and computer-readable storage medium
US8347275B2 (en) OpenGL to OpenGL/ES translator and OpenGL/ES simulator
CN111400024B (zh) Resource invocation method and apparatus during rendering, and rendering engine
CN106990961B (zh) Method for building a WebGL graphics rendering engine
US11094036B2 (en) Task execution on a graphics processor using indirect argument buffers
US10825129B2 (en) Eliminating off screen passes using memoryless render target
CN113076152B (zh) Rendering method and apparatus, electronic device, and computer-readable storage medium
WO2023197762A1 (zh) Image rendering method and apparatus, electronic device, computer-readable storage medium, and computer program product
CN114419226A (zh) Panorama rendering method and apparatus, computer device, and storage medium
CN114581580A (zh) Image rendering method and apparatus, storage medium, and electronic device
US8203567B2 (en) Graphics processing method and apparatus implementing window system
CN116433835A (zh) Construction method, apparatus, and medium for a three-dimensional augmented reality operating system
CN113724364A (zh) Setting method and apparatus for using polygons to occlude objects without rendering the occluding geometry itself
CN113419806A (zh) Image processing method and apparatus, computer device, and storage medium
CN116740241B (zh) Image processing method and electronic device
WO2022135050A1 (zh) Rendering method, device, and system
CN113436325B (zh) Image processing method and apparatus, electronic device, and storage medium
WO2023197729A1 (zh) Object rendering method and apparatus, electronic device, and storage medium
CN115904592A (zh) Virtual desktop display method and apparatus
CN107479978B (zh) E-book display method and apparatus, and mobile terminal
CN114138385A (zh) BIM model display method and apparatus, computer device, and storage medium
CN117710180A (zh) Image rendering method and related device
CN118799471A (zh) Material attribute storage method, apparatus, and device for a lightweight BIM engine

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application. Ref document number: 19852334; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase. Ref document number: 2020550861; Country of ref document: JP; Kind code of ref document: A
ENP Entry into the national phase. Ref document number: 2019852334; Country of ref document: EP; Effective date: 20200901
NENP Non-entry into the national phase. Ref country code: DE