CN111815788A - Three-dimensional map processing method, device, equipment and storage medium

Info

Publication number
CN111815788A
Authority
CN
China
Prior art keywords
coordinate system
coordinate
projection
candidate
dimensional
Prior art date
Legal status
Granted
Application number
CN202010708793.8A
Other languages
Chinese (zh)
Other versions
CN111815788B (en)
Inventor
肖春晖 (Xiao Chunhui)
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202010708793.8A
Publication of CN111815788A
Application granted
Publication of CN111815788B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/005: General purpose rendering architectures

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The disclosure provides a three-dimensional map processing method, apparatus, device, and storage medium. The method comprises: acquiring boundary information of a region to be culled in the three-dimensional map; generating, based on the boundary information, a culling matching template associated with the region to be culled in a first coordinate system; determining, in a second coordinate system, vertical projection coordinates of a candidate three-dimensional element in the three-dimensional map on a given plane, wherein the given plane is the plane corresponding to the region to be culled in the second coordinate system; determining, based on the vertical projection coordinates of the candidate three-dimensional element on the given plane, a first coordinate corresponding to the candidate three-dimensional element in the first coordinate system; and determining, in the first coordinate system, whether to cull the candidate three-dimensional element based on the first coordinate corresponding to the vertical projection coordinates and the culling matching template associated with the region to be culled.

Description

Three-dimensional map processing method, device, equipment and storage medium
Technical Field
The present disclosure relates to graphics processing technologies, and more particularly, to a three-dimensional map processing method, apparatus, device, and storage medium.
Background
In order to support overlaying the display of a three-dimensional (3D) indoor map on a three-dimensional map, adding custom models to a three-dimensional map, and other application scenarios, the three-dimensional map needs to provide a region culling capability, so that overlaid models do not overlap or collide with the existing three-dimensional map elements. Existing schemes generally perform collision calculation with mathematical geometric methods: in the data processing stage, the polygon of the culling region and the base map elements are subjected to collision calculation in a two-dimensional plane, for example by determining whether a specific point lies inside a polygon or whether two polygons intersect. However, such geometric collision schemes incur a huge computational overhead, resulting in high CPU occupancy, and they are not suitable for application scenarios in which the culling region changes dynamically.
Therefore, there is a need for a region culling method that is simple to implement, has low computational overhead, and can adapt to dynamic changes of the rendering target.
Disclosure of Invention
The embodiment of the present disclosure provides a three-dimensional map processing method, which includes: acquiring boundary information of a region to be culled in the three-dimensional map; generating, based on the boundary information, a culling matching template associated with the region to be culled in a first coordinate system; determining, in a second coordinate system, vertical projection coordinates of a candidate three-dimensional element in the three-dimensional map on a given plane, wherein the given plane is the plane corresponding to the region to be culled in the second coordinate system; determining, based on the vertical projection coordinates of the candidate three-dimensional element on the given plane, a first coordinate corresponding to the candidate three-dimensional element in the first coordinate system; and determining, in the first coordinate system, whether to cull the candidate three-dimensional element based on the first coordinate corresponding to the vertical projection coordinates and the culling matching template associated with the region to be culled.
According to an embodiment of the present disclosure, generating the culling matching template associated with the region to be culled in the first coordinate system based on the boundary information comprises: generating an initial matching template for the three-dimensional map in the first coordinate system, wherein the size of the initial matching template is determined based on the display size of the three-dimensional map, and the pixel resolution of the initial matching template is set according to a predetermined resolution; and, based on the boundary information of the region to be culled, setting the pixels of the initial matching template corresponding to the region to be culled to a first pixel value and setting the remaining pixels to a second pixel value different from the first pixel value, so as to generate the culling matching template.
According to an embodiment of the present disclosure, determining the vertical projection coordinates of the candidate three-dimensional element in the three-dimensional map on the given plane comprises: acquiring the coordinates of the candidate three-dimensional element in the second coordinate system as second coordinates; determining the plane corresponding to the region to be culled in the second coordinate system as the given plane; and determining the vertical projection coordinates of the candidate three-dimensional element on the given plane based on the second coordinates.
According to an embodiment of the present disclosure, acquiring the coordinates of the candidate three-dimensional element in the second coordinate system as the second coordinates includes: acquiring the coordinates of the vertices of the candidate three-dimensional element in the second coordinate system as the second coordinates.
According to an embodiment of the present disclosure, acquiring the coordinates of the candidate three-dimensional element in the second coordinate system as the second coordinates may alternatively include: acquiring the coordinates of the centroid point of the candidate three-dimensional element in the second coordinate system as the second coordinates.
According to an embodiment of the present disclosure, determining the first coordinate corresponding to the candidate three-dimensional element in the first coordinate system includes: determining a coordinate transformation relationship between the first coordinate system and the second coordinate system; and performing a coordinate transformation on the vertical projection coordinates based on the coordinate transformation relationship, to determine the first coordinate corresponding to the vertical projection coordinates of the candidate three-dimensional element in the first coordinate system.
According to an embodiment of the present disclosure, the first coordinate system is a texture coordinate system and the second coordinate system is a spatial coordinate system. Determining the coordinate transformation relationship between the first coordinate system and the second coordinate system comprises: determining a view projection relationship between the second coordinate system and a projection coordinate system; determining a translation scaling relationship between the projection coordinate system and the first coordinate system; and determining the coordinate transformation relationship between the first coordinate system and the second coordinate system based on the view projection relationship and the translation scaling relationship.
According to an embodiment of the present disclosure, determining the view projection relationship between the second coordinate system and the projection coordinate system comprises: determining shooting parameters of a virtual camera, wherein the shooting parameters comprise a spatial position and a shooting angle of the virtual camera and are associated with the projection coordinate system; determining projection parameters of a projection plane, wherein the projection parameters comprise a projection window position of the projection plane and are associated with the projection coordinate system; and determining the view projection relationship between the second coordinate system and the projection coordinate system based on the shooting parameters of the virtual camera and the projection parameters of the projection plane.
According to an embodiment of the present disclosure, the view projection relationship is a view projection matrix from the second coordinate system to the projection coordinate system, and the translation scaling relationship is a translation scaling matrix from the projection coordinate system to the first coordinate system. Determining the first coordinate corresponding to the vertical projection coordinates of the candidate three-dimensional element in the first coordinate system comprises: determining the first coordinate based on the vertical projection coordinates of the candidate three-dimensional element, the view projection matrix, and the translation scaling matrix.
According to an embodiment of the present disclosure, determining the first coordinate based on the vertical projection coordinates of the candidate three-dimensional element, the view projection matrix, and the translation scaling matrix comprises: generating, based on the vertical projection coordinates and the view projection matrix, a third coordinate corresponding to the vertical projection coordinates in the projection coordinate system; and determining, based on the third coordinate and the translation scaling matrix, the first coordinate corresponding to the vertical projection coordinates of the candidate three-dimensional element in the first coordinate system.
According to an embodiment of the present disclosure, determining, in the first coordinate system, whether to cull the candidate three-dimensional element based on the first coordinate corresponding to the vertical projection coordinates and the culling matching template associated with the region to be culled includes: acquiring, from the culling matching template, the pixel value corresponding to the first coordinate; and determining to cull the candidate three-dimensional element if at least a portion of the pixel values are the first pixel value.
An embodiment of the present disclosure provides a three-dimensional map processing apparatus, including: an acquisition module configured to acquire boundary information of a region to be culled in the three-dimensional map; a template generation module configured to generate, based on the boundary information, a culling matching template associated with the region to be culled in a first coordinate system; a projection determination module configured to determine, in a second coordinate system, vertical projection coordinates of a candidate three-dimensional element in the three-dimensional map on a given plane, wherein the given plane is the plane corresponding to the region to be culled in the second coordinate system; a coordinate transformation module configured to determine, based on the vertical projection coordinates of the candidate three-dimensional element on the given plane, a first coordinate corresponding to the candidate three-dimensional element in the first coordinate system; and a template matching module configured to determine, in the first coordinate system, whether to cull the candidate three-dimensional element based on the first coordinate corresponding to the vertical projection coordinates and the culling matching template associated with the region to be culled.
According to an embodiment of the present disclosure, determining, in the first coordinate system, whether to cull the candidate three-dimensional element based on the first coordinate corresponding to the vertical projection coordinates and the culling matching template associated with the region to be culled includes: acquiring, from the culling matching template, the pixel value corresponding to the first coordinate; and determining to cull the candidate three-dimensional element if at least a portion of the pixel values are the first pixel value.
An embodiment of the present disclosure provides a three-dimensional map processing device, including: a processor; and a memory having stored thereon computer-executable instructions which, when executed by the processor, implement the method described above.
Embodiments of the present disclosure provide a computer-readable storage medium having stored thereon computer-executable instructions which, when executed by a processor, implement the method described above.
Embodiments of the present disclosure provide a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the three-dimensional map processing method according to the embodiments of the present disclosure.
In summary, embodiments of the present disclosure provide a three-dimensional map processing method, apparatus, device, and storage medium. The method renders the polygon of the region to be culled into a frame buffer to form a binary template texture, transforms the flattened (z-dropped) vertex coordinates of the base map elements to be rendered (such as buildings, points of interest (POIs), and the like) into the texture coordinate system, and determines from the texture pixels whether to discard each element. Since the whole method is carried out in the rendering stage and is decoupled from other modules, it can adapt to dynamic changes of the rendering target (or the culling region).
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the description of the embodiments will be briefly introduced below. It is apparent that the drawings in the following description are only exemplary embodiments of the disclosure, and that other drawings may be derived from those drawings by a person of ordinary skill in the art without inventive effort.
Fig. 1 shows an application scenario diagram of a three-dimensional map processing method according to an embodiment of the present disclosure.
Fig. 2 shows a processing diagram of a geometric collision-based position comparison method.
Fig. 3 shows a process diagram of a separation axis algorithm based on geometric collision.
Fig. 4 shows a flowchart of a three-dimensional map processing method based on template testing according to an embodiment of the present disclosure.
Fig. 5 shows a processing diagram of a three-dimensional map processing method based on a template test according to an embodiment of the disclosure.
FIG. 6 shows a schematic diagram of a process of spatial transformation from a two-dimensional plane to a screen space according to an embodiment of the present disclosure.
Fig. 7 shows a flowchart of a three-dimensional map processing method according to an embodiment of the present disclosure.
Fig. 8a and 8b respectively show schematic diagrams of a culling matching template generated based on boundary information of a region to be culled in a texture coordinate system and a corresponding texture image thereof according to an embodiment of the present disclosure.
FIG. 9 illustrates a schematic vertical projection of a candidate three-dimensional element according to an embodiment of the disclosure.
FIG. 10 shows a schematic diagram of a coordinate transformation process according to an embodiment of the present disclosure.
FIG. 11 is a schematic diagram illustrating the determination of whether to cull a candidate three-dimensional element from a culling matching template according to an embodiment of the disclosure.
Fig. 12 illustrates an exemplary process flow of a three-dimensional map processing method according to an embodiment of the present disclosure.
Fig. 13 shows a schematic diagram of a three-dimensional map processing apparatus according to an embodiment of the present disclosure.
Fig. 14 shows a schematic diagram of a three-dimensional map processing device according to an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the present disclosure more apparent, example embodiments according to the present disclosure will be described in detail below with reference to the accompanying drawings. It is to be understood that the described embodiments are merely a subset of the embodiments of the present disclosure and not all embodiments of the present disclosure, with the understanding that the present disclosure is not limited to the example embodiments described herein.
In the present specification and the drawings, substantially the same or similar steps and elements are denoted by the same or similar reference numerals, and repeated descriptions of the steps and elements will be omitted. Meanwhile, in the description of the present disclosure, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance or order.
In the specification and drawings, elements are described in singular or plural according to embodiments. However, the singular and plural forms are appropriately selected for the proposed cases only for convenience of explanation and are not intended to limit the present disclosure thereto. Thus, the singular may include the plural and the plural may also include the singular, unless the context clearly dictates otherwise.
Embodiments of the present disclosure relate to a graphic processing scenario of a three-dimensional map, and for ease of understanding, some basic concepts related to embodiments of the present disclosure are first described below.
OpenGL: the Open Graphics Library, a cross-language, cross-platform application programming interface for rendering two-dimensional and three-dimensional vector graphics.
WebGL: the Web Graphics Library, a three-dimensional drawing standard that allows JavaScript to be combined with OpenGL ES 2.0 to render two-dimensional or three-dimensional graphics through the canvas element in a browser.
Frame buffer: to render graphics, OpenGL or WebGL requires a color buffer for writing colors, a depth buffer for writing depth information, and a stencil buffer for specifying discarded fragments; the combination of these buffers is referred to as the frame buffer.
Stencil test (also called template test): during fragment (e.g., pixel) processing, the stencil value in the stencil buffer determines whether the corresponding fragment is discarded.
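As a concrete illustration of the stencil-test concept (not part of the patent text), the following minimal WebGL sketch requests a stencil buffer and configures a two-pass stencil test; the canvas id and the reference values are assumptions:

    // Minimal WebGL stencil-test setup; assumes an HTML canvas with id "map".
    const canvas = document.getElementById("map") as HTMLCanvasElement;
    const gl = canvas.getContext("webgl", { stencil: true }) as WebGLRenderingContext;

    gl.enable(gl.STENCIL_TEST);

    // Pass 1: write the value 1 into the stencil buffer wherever mask geometry is drawn.
    gl.stencilFunc(gl.ALWAYS, 1, 0xff);
    gl.stencilOp(gl.KEEP, gl.KEEP, gl.REPLACE);
    // ... draw the mask geometry here ...

    // Pass 2: draw the scene; fragments whose stencil value equals 1 are discarded.
    gl.stencilFunc(gl.NOTEQUAL, 1, 0xff);
    gl.stencilOp(gl.KEEP, gl.KEEP, gl.KEEP);
    // ... draw the scene here ...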
Embodiments of the present disclosure will be further described with reference to the accompanying drawings.
Fig. 1 shows an application scenario diagram of a three-dimensional map processing method according to an embodiment of the present disclosure. Specifically, fig. 1 shows an application scenario of the indoor and outdoor integrated three-dimensional map 100.
As shown in the left part of fig. 1, in the outdoor map display mode, the indoor-outdoor integrated three-dimensional map 100 may present, on a map display interface, an outdoor map including a plurality of base map elements (three-dimensional buildings, points of interest (POIs), etc., such as the three-dimensional elements 102 and 103). In one embodiment, when a specific area on the map, for example the area corresponding to a certain mall building, is enlarged further, the indoor-outdoor integrated three-dimensional map 100 may enter an indoor map display mode. In this case, the map 100 may determine the area corresponding to the indoor map of the mall as the region to be culled 101, perform position comparison between the candidate three-dimensional elements currently being presented (e.g., the three-dimensional elements 102 and 103) and the region to be culled 101, cull the three-dimensional elements located inside the region to be culled 101 or at least partially overlapping it (e.g., the three-dimensional element 102), and present, within the region to be culled 101, the indoor map elements 104 corresponding to the mall (e.g., two-dimensional maps of the businesses on each floor inside the mall), as shown in the right part of fig. 1.
For such position comparison, fig. 2 shows a processing diagram of a geometric collision-based position comparison method 200.
As shown in fig. 2, in the data processing stage, collision calculation may be performed in a two-dimensional plane between the region to be culled 101 and the candidate three-dimensional elements (e.g., the three-dimensional elements 202, 203, and 204); for example, it may be determined whether the two-dimensional projection of each candidate three-dimensional element on the plane corresponding to the region to be culled 101 lies inside the region to be culled 101 or intersects it. In the case shown in fig. 2, based on the determination result, the three-dimensional elements 202 and 203, whose two-dimensional projections lie inside the region to be culled 101, may be determined as elements to be culled, and the three-dimensional element 204 may be determined as an element not to be culled.
However, the geometric collision scheme shown in fig. 2 may cause a large computational overhead, resulting in high CPU occupancy. First, the candidate elements on the base map are numerous and of complex types (including points, lines, planes, volumes, etc.): in general, a full screen comprises about 20-40 tiles, and each tile comprises about 100-600 base map elements, so the amount of calculation is large. Second, the geometric collision algorithms themselves are complex, involving judgments of intersection, containment, and other relationships between complex polygons; the most direct algorithm traverses the vertices of the candidate elements to judge their relationship with the region to be culled, but with n vertices the complexity reaches O(n²).
FIG. 3 shows a process diagram of a separation axis algorithm 300 based on geometric collision. As shown in fig. 3, taking the candidate element 301 and the region-to-be-culled quadrilateral 302 as an example, planes (e.g., planes 1'-7') perpendicular to all the edges (e.g., edges 1-7) of the candidate element 301 and of the region-to-be-culled quadrilateral 302 may be determined, and it is then determined whether the projections of the candidate element 301 and of the region-to-be-culled quadrilateral 302 onto each plane intersect. In one embodiment, if the projections of the candidate element 301 and the region-to-be-culled quadrilateral 302 onto at least one plane do not intersect, the candidate element 301 and the region-to-be-culled quadrilateral 302 may be determined not to intersect.
For simplicity of description, fig. 3 only shows the projections of the candidate element 301 and the region-to-be-culled quadrilateral 302 onto planes 1' and 2'. As shown in FIG. 3, the projections 301-1' and 302-1' of the candidate element 301 and the region-to-be-culled quadrilateral 302 on plane 1' intersect, while the projections 301-2' and 302-2' on plane 2' do not. Therefore, in one embodiment, according to the above determination rule, the candidate element 301 and the region-to-be-culled quadrilateral 302 may be determined to be disjoint, because their projections are disjoint on at least one plane (e.g., plane 2').
The separation axis algorithm 300 shown in fig. 3 is relatively efficient, but it is generally applicable only to convex polygons; for concave polygons, each concave polygon must first be decomposed into a combination of convex polygons before subsequent processing, which increases the computational overhead. A sketch of the overlap test it relies on is given below.
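For reference, a minimal TypeScript sketch of the separating-axis overlap test described above, assuming convex polygons given as vertex arrays (the type and helper names are illustrative, not from the patent):

    type Point = { x: number; y: number };

    // One perpendicular (normal) per polygon edge; these are the candidate separating axes.
    function edgeNormals(poly: Point[]): Point[] {
      return poly.map((p, i) => {
        const q = poly[(i + 1) % poly.length];
        return { x: -(q.y - p.y), y: q.x - p.x }; // perpendicular to the edge p -> q
      });
    }

    // Interval of the polygon's projection onto an axis.
    function project(poly: Point[], axis: Point): [number, number] {
      let min = Infinity, max = -Infinity;
      for (const p of poly) {
        const d = p.x * axis.x + p.y * axis.y;
        min = Math.min(min, d);
        max = Math.max(max, d);
      }
      return [min, max];
    }

    function convexPolygonsIntersect(a: Point[], b: Point[]): boolean {
      for (const axis of [...edgeNormals(a), ...edgeNormals(b)]) {
        const [minA, maxA] = project(a, axis);
        const [minB, maxB] = project(b, axis);
        if (maxA < minB || maxB < minA) return false; // separating axis found: disjoint
      }
      return true; // no separating axis: the polygons intersect
    }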
On the other hand, in the three-dimensional map rendering related to the present disclosure, in order to improve rendering performance, elements of the same type on the base map (for example, a plurality of three-dimensional buildings, a plurality of POIs, or the like) are generally merged and then drawn in batches. This requires that, in a data parsing stage before the rendering stage, the vertex data obtained by triangulating the elements of the same type in a tile be merged, and that the parsed data be cached to avoid repeated processing. However, the collision calculation shown in fig. 2 or fig. 3 operates on single elements, and therefore must be performed before the triangulation, data merging, and caching steps. When the culling region (e.g., the region to be culled 101 or the region-to-be-culled quadrilateral 302) changes dynamically, for example when a user dynamically moves the map display window or dynamically loads or removes a three-dimensional overlay, the collision calculation and the culling of the base map elements need to be performed again, and the data parsing process described above must then be repeated for the base map elements that are not culled, which may cause a huge computational overhead. Therefore, the collision calculation schemes shown in fig. 2 and fig. 3 are not suitable for application scenarios in which the culling region changes dynamically.
In view of this, embodiments of the present disclosure provide a three-dimensional map processing method 400, which performs element culling during rendering, greatly reduces the amount of computation compared with the collision calculation schemes shown in fig. 2 and fig. 3, and can adapt to dynamic changes of the culling region. Specifically, fig. 4 shows a flowchart of the three-dimensional map processing method 400 based on template testing according to an embodiment of the present disclosure, and fig. 5 shows a processing diagram of the method 400.
An OpenGL or WebGL processing scenario is taken as an example for description. In OpenGL or WebGL, a stencil test determines, during processing of a fragment (e.g., a pixel), whether to discard that fragment according to the stencil value in the stencil buffer. As shown in fig. 4 and 5, in one embodiment, first, in step S401, the map data including all base map elements may be rendered into a color buffer 501 (corresponding to the screen space of the final presentation). Then, in step S402, a corresponding culling matching template 502 may be generated in the stencil buffer of the frame buffer based on the boundary information of the region to be culled (e.g., its boundary latitude and longitude information). For example, as shown in fig. 5, the pixels corresponding to the region to be culled may be set to a first pixel value (e.g., 0), while the remaining pixels in the culling matching template 502 may be set to a second pixel value (e.g., 1) different from the first pixel value. Next, in step S403, the pixels in the color buffer 501 may be stencil-tested against the culling matching template 502 in the stencil buffer. In one embodiment, whether a particular pixel in the color buffer 501 is discarded in the subsequent rendered presentation may be determined according to the pixel value at the corresponding location in the culling matching template 502: if that value is 0, the pixel is discarded; if it is 1, the pixel is kept. As shown in fig. 5, after the template matching, the pixels corresponding to the culled area 504 are discarded in the template matching result 503.
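As an illustration, steps S402 and S403 might be realized in WebGL roughly as follows; note that in an actual pipeline the stencil mask is written before the tested draw call, and the helper functions and context setup are assumptions:

    // Sketch of steps S402-S403; assumes a WebGL context with a stencil buffer and
    // two assumed helpers that issue the actual draw calls.
    declare const gl: WebGLRenderingContext;
    declare function drawCullRegionPolygon(): void; // draws the region-to-be-culled polygon
    declare function drawBaseMapElements(): void;   // draws the candidate base map elements

    // Step S402: generate the culling matching template in the stencil buffer
    // (culling-region pixels get the first pixel value 0, all others the second value 1).
    gl.clearStencil(1);
    gl.clear(gl.STENCIL_BUFFER_BIT);
    gl.enable(gl.STENCIL_TEST);
    gl.colorMask(false, false, false, false);   // write only the stencil buffer
    gl.stencilFunc(gl.ALWAYS, 0, 0xff);
    gl.stencilOp(gl.KEEP, gl.KEEP, gl.REPLACE);
    drawCullRegionPolygon();

    // Step S403: stencil-test the base map elements; fragments over stencil value 0 are dropped.
    gl.colorMask(true, true, true, true);
    gl.stencilFunc(gl.EQUAL, 1, 0xff);          // keep only pixels whose stencil value is 1
    gl.stencilOp(gl.KEEP, gl.KEEP, gl.KEEP);
    drawBaseMapElements();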
Still taking OpenGL or WebGL as an example, the above-mentioned template matching or template testing process is typically performed after the fragment shader stage. That is, in this case, the template matching and culling process can be generally performed only on pixels in the color buffer in the screen space. Here, the screen space may be a final display space corresponding to a display screen, which may show a three-dimensional graphic element (e.g., pseudo 3D graphic) in the form of a two-dimensional plane. Fig. 6 shows a schematic diagram of a process 600 of spatial transformation from a two-dimensional plane to a screen space according to an embodiment of the disclosure.
In the embodiment shown in fig. 6, in the two-dimensional plane, the region to be culled 601 is a square, the projection of the three-dimensional element 602 lies inside the region to be culled 601, and the projection of the three-dimensional element 603 lies outside it. After the spatial transformation, however, in screen space the region to be culled 601 becomes a parallelogram-shaped region 601', while the three-dimensional elements 602 and 603 gain a depth (i.e., height) dimension, changing from squares into cube-shaped three-dimensional elements 602' and 603'. Thus, as shown in FIG. 6, if the template matching and culling process were performed in screen space, the three-dimensional element 603' (or parts of it) might be mistakenly culled, while the three-dimensional element 602' might not be completely culled.
In this regard, embodiments of the present disclosure further provide an improved three-dimensional map processing method 700, as shown in fig. 7.
Fig. 7 shows a flow diagram of a three-dimensional map processing method 700 according to an embodiment of the disclosure.
As shown in fig. 7, first, in step S701, the boundary information of the region to be culled in the three-dimensional map may be acquired. In one embodiment, the region to be culled may be a two-dimensional plane region, and its boundary information may be boundary latitude and longitude information. For example, taking the region to be culled 601 in fig. 6 as an example, relative to a specific map origin, its upper and lower boundaries may be latitude 30°31′38″ N and 30°31′26″ N, respectively, and its left and right boundaries may be longitude 135°35′25″ E and 135°35′39″ E, respectively. In one embodiment, the boundary information of the region to be culled may also be two-dimensional or three-dimensional coordinates representing the vertices of the region to be culled in a map plane coordinate system or a three-dimensional spatial coordinate system. In other embodiments, the boundary information may be any other form of information capable of characterizing the extent of the region to be culled.
In step S702, a culling matching template associated with the region to be culled may be generated in the first coordinate system based on the boundary information.
Specifically, fig. 8a and 8b respectively show schematic diagrams of a culling matching template 810 generated based on boundary information of a region to be culled in a texture coordinate system and a corresponding texture image 820 thereof according to an embodiment of the present disclosure.
As shown in fig. 8a, in one embodiment, the first coordinate system may be a texture coordinate system, and the initial matching template for the three-dimensional map may be generated in the texture coordinate system. In this embodiment, the size of the initial matching template may be determined based on the display size of the three-dimensional map. Specifically, it may be determined based on the size of the display screen (not shown) to be used for displaying the three-dimensional map; for example, the size of that display screen may be used directly as the size of the initial matching template, or the display screen size scaled by a certain ratio may be used, and so on. Further, the pixel resolution of the initial matching template may be set according to a predetermined resolution; for example, a preset display resolution of the display screen used for displaying the three-dimensional map may be used as the pixel resolution of the initial matching template (e.g., 8 × 10 as shown in fig. 8a).
In addition, in this embodiment, based on the boundary information of the region to be culled, the pixels of the initial matching template corresponding to the region to be culled are set to a first pixel value, and the remaining pixels of the initial matching template are set to a second pixel value different from the first pixel value, so as to generate the culling matching template associated with the region to be culled. For example, in one embodiment, as shown in fig. 8a, the boundary information of the region to be culled acquired in step S701 may be converted into boundary information in the texture coordinate system, and, based on that boundary information, the pixels of the initial matching template corresponding to the region to be culled may be set to the first pixel value (e.g., 0) and the remaining pixels to the second pixel value (e.g., 1). Thus, a culling matching template 810 associated with the region to be culled may be generated. FIG. 8b shows a schematic diagram of the texture image 820 corresponding to the culling matching template 810. As shown in fig. 8b, the white parallelogram region corresponds to the region to be culled in the texture coordinate system, and the remaining black background corresponds to the region that does not need to be culled.
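For illustration, a CPU-side sketch of building such a culling matching template as a binary pixel array; the sizes, names, and the point-in-polygon helper are assumptions rather than the patent's implementation:

    declare function pointInPolygon(s: number, t: number,
                                    polygon: Array<[number, number]>): boolean;

    // Build a culling matching template: 0 inside the region to be culled, 1 elsewhere.
    function buildCullingTemplate(
      width: number,                             // template width in pixels, e.g. display width
      height: number,                            // template height in pixels
      regionTexCoords: Array<[number, number]>   // region boundary in texture coordinates [0, 1]
    ): Uint8Array {
      const template = new Uint8Array(width * height).fill(1); // second pixel value: 1
      for (let row = 0; row < height; row++) {
        for (let col = 0; col < width; col++) {
          const s = (col + 0.5) / width;         // pixel-center texture coordinates
          const t = (row + 0.5) / height;
          if (pointInPolygon(s, t, regionTexCoords)) {
            template[row * width + col] = 0;     // first pixel value: 0
          }
        }
      }
      return template;
    }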
Next, in step S703, the vertical projection coordinates of the candidate three-dimensional elements in the three-dimensional map on a given plane may be determined in the second coordinate system, the given plane being the plane corresponding to the region to be culled in the second coordinate system.
In one embodiment, the coordinates of the candidate three-dimensional element in the second coordinate system may be acquired as the second coordinates; the plane corresponding to the region to be culled in the second coordinate system may be determined as the given plane; and the vertical projection coordinates of the candidate three-dimensional element on the given plane may be determined based on the second coordinates.
In particular, fig. 9 shows a schematic vertical projection of a candidate three-dimensional element according to an embodiment of the present disclosure.
In one embodiment, the second coordinate system may be the spatial coordinate system 900 of the three-dimensional world. As shown in fig. 1, the plane (or map plane) corresponding to the region to be culled 101 may be the ground plane of the three-dimensional world; accordingly, as shown in fig. 9, in the spatial coordinate system 900, the plane corresponding to the region to be culled 902 may be determined as the plane z = 0 (i.e., the XOY plane). Further, the coordinates (i.e., the second coordinates) of the candidate three-dimensional element 903 in the spatial coordinate system 900 may be acquired. In one embodiment, the coordinates of one or more vertices of the candidate three-dimensional element 903 in the spatial coordinate system 900 may be acquired; for example, the coordinates (x_world, y_world, z_world) of the vertex 901 of the candidate three-dimensional element 903 may be acquired as the coordinates of the candidate three-dimensional element 903 in the spatial coordinate system 900. In another embodiment, the coordinates of the centroid point (e.g., the top center point) 904 of the candidate three-dimensional element 903 in the spatial coordinate system 900 may instead be acquired. Hereinafter, the vertex 901 is taken as an example. In one embodiment, the vertical projection coordinates of the candidate three-dimensional element 903 on the given plane (i.e., the XOY plane) may be determined based on the coordinates of the vertex 901. As shown in fig. 9, the vertical projection coordinate of the vertex 901 (x_world, y_world, z_world) on the XOY plane (i.e., the coordinate of the vertical projection point 901') may be determined as (x_world, y_world, 0).
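In code, the vertical projection described here is a one-liner; a sketch under the assumption that the given plane is z = 0:

    type Vec3 = { x: number; y: number; z: number };

    // Vertical projection of a vertex onto the plane z = 0 (the plane of the region to be culled).
    function projectToGroundPlane(v: Vec3): Vec3 {
      return { x: v.x, y: v.y, z: 0 }; // (x_world, y_world, z_world) -> (x_world, y_world, 0)
    }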
Next, returning to fig. 7, in step S704, a first coordinate corresponding to the candidate three-dimensional element in the first coordinate system may be determined based on the vertically projected coordinate of the candidate three-dimensional element on the given plane.
In one embodiment, determining the corresponding first coordinate of the candidate three-dimensional element in the first coordinate system may include: determining a coordinate transformation relation between a first coordinate system and a second coordinate system; and performing coordinate transformation on the vertical projection coordinates based on the coordinate transformation relation to determine first coordinates corresponding to the vertical projection coordinates of the candidate three-dimensional elements in the first coordinate system.
In particular, fig. 10 shows a coordinate transformation process diagram 1000 according to an embodiment of the present disclosure.
As shown in fig. 10, the first coordinate system may be a two-dimensional texture coordinate system 1002, and the second coordinate system may be a three-dimensional spatial coordinate system 900. In one embodiment, as shown in fig. 10, a view projection relationship between spatial coordinate system 900 and projection coordinate system 1001, and a translation scaling relationship between projection coordinate system 1001 and texture coordinate system 1002 may first be determined, and then a coordinate transformation relationship between spatial coordinate system 900 and texture coordinate system 1002 may be determined based on the view projection relationship and the translation scaling relationship.
In one embodiment, determining the view projection relationship between spatial coordinate system 900 and projection coordinate system 1001 may comprise: shooting parameters of the virtual camera associated with the projection coordinate system 1001 and projection parameters of the projection plane associated with the projection coordinate system 1001 are determined, and a view projection relationship between the spatial coordinate system 900 and the projection coordinate system 1001 is determined based on the shooting parameters of the virtual camera and the projection parameters of the projection plane.
In particular, in the three-dimensional map rendering related to the present disclosure, the two-dimensional presentation of the three-dimensional map on the display screen is realized from a certain viewing angle. That is, a virtual camera 1003 (or an observing eye) may be assumed in the spatial coordinate system 900, and the three-dimensional map is presented based on the shooting parameters of the virtual camera 1003. In one embodiment, the shooting parameters of the virtual camera 1003 may include its spatial position in the spatial coordinate system 900, its shooting angle, and the like. In this way, the three-dimensional map can be converted from the spatial coordinate system 900 to a camera coordinate system (or eye coordinate system) based on the viewing angle of the virtual camera 1003. Further, as shown in fig. 10, a projection plane 1007 may also be determined, and the camera coordinate system may be further converted into the projection coordinate system 1001 based on the projection parameters of the projection plane 1007. In one embodiment, the projection parameters of the projection plane 1007 may include parameters such as the projection window position of the projection plane 1007. In this way, elements of the three-dimensional map at different distances from the projection plane 1007 are projected onto the projection plane 1007 at different sizes, achieving the perspective effect in which near objects appear larger and far objects smaller. Thereby, the view projection relationship between the spatial coordinate system 900 and the projection coordinate system 1001 can be determined based on the shooting parameters of the virtual camera 1003 and the projection parameters of the projection plane 1007 as described above.
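For illustration, the two matrices that make up this view projection relationship might be derived from the camera and projection parameters as follows, using the gl-matrix library (an assumed dependency; all numeric parameters are placeholders):

    import { mat4, vec3 } from "gl-matrix";

    // View matrix from the virtual camera's shooting parameters (position and angle).
    const viewMatrix = mat4.lookAt(
      mat4.create(),
      vec3.fromValues(0, -100, 100), // assumed camera position in the spatial coordinate system
      vec3.fromValues(0, 0, 0),      // assumed look-at target
      vec3.fromValues(0, 0, 1)       // up axis (z-up, matching the z = 0 ground plane)
    );

    // Perspective matrix from the projection plane's parameters (projection window).
    const perspectiveMatrix = mat4.perspective(
      mat4.create(),
      Math.PI / 4,  // assumed vertical field of view
      16 / 9,       // assumed aspect ratio of the projection window
      0.1, 1000.0   // assumed near/far planes
    );

    // View projection matrix from the spatial coordinate system to the projection coordinate system.
    const viewProjection = mat4.multiply(mat4.create(), perspectiveMatrix, viewMatrix);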
In one embodiment, a translation scaling relationship between the projection coordinate system 1001 and the texture coordinate system 1002 may also be determined. For example, in one embodiment, the x-axis and y-axis coordinate ranges of the projection coordinate system 1001 may each be -1 to 1, while the s-axis and t-axis coordinate ranges of the texture coordinate system 1002 may each be 0 to 1. Therefore, the translation scaling relationship between the projection coordinate system 1001 and the texture coordinate system 1002 can be determined based on the ratio of their coordinate ranges, the relationship between their origin positions, and the like, and the conversion between the two coordinate systems can be realized based on this translation scaling relationship.
Further, based on the view projection relationship between the spatial coordinate system 900 and the projection coordinate system 1001, and the translation scaling relationship between the projection coordinate system 1001 and the texture coordinate system 1002 as described above, the coordinate transformation relationship between the spatial coordinate system 900 and the texture coordinate system 1002 can be determined.
In one embodiment, the view projection relationship between the spatial coordinate system 900 and the projection coordinate system 1001 may be a view projection matrix from the spatial coordinate system 900 to the projection coordinate system 1001. In this embodiment, a third coordinate corresponding to the vertical projection coordinate 1004 in the projection coordinate system 1001, namely the projection coordinate system coordinate 1005 ([x_clip, y_clip, z_clip, w_clip]^T), may be generated based on the vertical projection coordinate 1004 of the candidate three-dimensional element ([x_world, y_world, 0, 1]^T) and the view projection matrix, as shown in equation (1):

[x_clip, y_clip, z_clip, w_clip]^T = PerspectiveMatrix × ViewMatrix × [x_world, y_world, 0, 1]^T    (1)
Taking the vertical projection coordinates of one vertex of the candidate three-dimensional element as an example, in equation (1), [x_world, y_world, 0, 1]^T is the vertical projection coordinate 1004 of the vertex, in the spatial coordinate system 900, on the map plane or the plane corresponding to the region to be culled (i.e., the plane z = 0). Homogeneous coordinates are used here: the first three dimensions are the spatial coordinates, and the fourth dimension is a transformation dimension related to graphic transformations such as translation, rotation, and scaling. In one embodiment, ViewMatrix may be a view matrix associated with the shooting parameters of the virtual camera 1003 as described above, and PerspectiveMatrix may be a perspective matrix associated with the projection parameters of the projection plane 1007 as described above; that is, PerspectiveMatrix × ViewMatrix represents the view projection matrix from the spatial coordinate system 900 to the projection coordinate system 1001. It should be understood that PerspectiveMatrix and ViewMatrix can be obtained in a manner well known in the art (not described in detail herein) or in a manner developed in the future, and the technical solution of the present disclosure is not limited by the manner of generating these matrices. In one embodiment, the view matrix may be determined and multiplied by the vertical projection coordinate 1004 for a first coordinate transformation, and the result may then be multiplied by the perspective matrix for a second coordinate transformation, realizing a two-stage coordinate transformation from the spatial coordinate system 900 (i.e., [x_world, y_world, 0, 1]^T) to the projection coordinate system 1001 (i.e., [x_clip, y_clip, z_clip, w_clip]^T). In another embodiment, the view projection matrix may be determined in advance based on parameters such as the shooting parameters of the virtual camera 1003 and the projection parameters of the projection plane 1007, and then directly multiplied by the vertical projection coordinate 1004 to realize the coordinate transformation from the spatial coordinate system 900 to the projection coordinate system 1001.
In one embodiment, the translation scaling relationship between projection coordinate system 1001 and texture coordinate system 1002 may be a translation scaling matrix from projection coordinate system 1001 to texture coordinate system 1002. In this embodiment, a first coordinate, i.e., texture coordinate 1006, corresponding to the vertical projection coordinate 1004 in the texture coordinate system 1002 may also be generated based on the projection coordinate system coordinate 1005 corresponding to the candidate three-dimensional element in the projection coordinate system 1001 and the translation scaling matrix, as shown in the following equation (2).
[s, t, 1]^T = [0.5, 0, 0.5; 0, 0.5, 0.5; 0, 0, 1] × [x_clip, y_clip, 1]^T    (2)
In equation (2), [x_clip, y_clip]^T is the projection coordinate system coordinate 1005 of the candidate three-dimensional element in the two-dimensional projection coordinate system 1001, represented in homogeneous form as [x_clip, y_clip, 1]^T. The matrix [0.5, 0, 0.5; 0, 0.5, 0.5; 0, 0, 1] is the translation scaling matrix from the projection coordinate system 1001 to the texture coordinate system 1002, which, as described above, may be determined based on the ratio of the coordinate ranges of the projection coordinate system 1001 and the texture coordinate system 1002, the relationship between their origin positions, and the like. Multiplying the translation scaling matrix by the projection coordinate system coordinate 1005 yields the first coordinate corresponding to the candidate three-dimensional element in the texture coordinate system 1002, i.e., the texture coordinate 1006, whose homogeneous form is [s, t, 1]^T.
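Putting equations (1) and (2) together, a sketch of the full transformation from a vertical projection coordinate to texture coordinates, again using gl-matrix; the perspective divide by w_clip, which the patent text leaves implicit, is included here as an assumption about how the homogeneous clip coordinate is resolved:

    import { mat4, vec4 } from "gl-matrix";

    declare const viewProjection: mat4; // PerspectiveMatrix × ViewMatrix, as in equation (1)

    // Transform a vertical projection coordinate (x_world, y_world, 0) to texture coordinates (s, t).
    function worldToTexture(xWorld: number, yWorld: number): [number, number] {
      // Equation (1): clip coordinates = viewProjection * [x_world, y_world, 0, 1]^T.
      const clip = vec4.transformMat4(
        vec4.create(),
        vec4.fromValues(xWorld, yWorld, 0, 1),
        viewProjection
      );
      // Perspective divide (assumed; left implicit in the patent text): clip -> [-1, 1] range.
      const xClip = clip[0] / clip[3];
      const yClip = clip[1] / clip[3];
      // Equation (2): translation scaling from [-1, 1] to the texture range [0, 1].
      const s = 0.5 * xClip + 0.5;
      const t = 0.5 * yClip + 0.5;
      return [s, t];
    }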
In another embodiment, the overall transformation matrix from spatial coordinate system 900 to texture coordinate system 1002 may be predetermined based on the view projection matrix from spatial coordinate system 900 to projection coordinate system 1001 and the translation scaling matrix from projection coordinate system 1001 to texture coordinate system 1002, and then the transformation of vertical projection coordinates 1004 from spatial coordinate system 900 to texture coordinate system 1002 may be implemented based on the overall transformation matrix.
Returning to fig. 7 again, finally, in step S705, whether to cull the candidate three-dimensional element may be determined, in the first coordinate system, based on the first coordinate corresponding to the vertical projection coordinates and the culling matching template associated with the region to be culled.
In one embodiment, since the culling matching template 810 generated in step S702 and the texture coordinates 1006 of the candidate three-dimensional element determined in step S704 are in the same coordinate system (i.e., the texture coordinate system), template matching may be performed directly based on the texture coordinates 1006 of the candidate three-dimensional element and the culling matching template 810. For example, the pixel value corresponding to the texture coordinates 1006 of the candidate element may be obtained from the culling matching template 810; when that pixel value is the first pixel value (e.g., "0" as shown in fig. 8a), indicating that the pixel should be discarded, the pixel may be discarded and the candidate three-dimensional element determined to be culled.
FIG. 11 illustrates a diagram 1100 for determining whether to cull a candidate three-dimensional element from a culling matching template, according to an embodiment of the disclosure. Specifically, fig. 11 visually shows a schematic diagram for determining whether to reject a candidate three-dimensional element from the texture image 820 as described above.
As shown in fig. 11, in the texture coordinate system, a candidate three-dimensional element 1103 located within the region to be culled 1101 may be determined as culled, and a candidate three-dimensional element 1102 located outside the region to be culled 1101 may be determined as retained.
In another embodiment, when the texture coordinates 1006 of the candidate three-dimensional element include the texture coordinates of a plurality of its vertices, the candidate three-dimensional element may be determined to be culled if at least a part of the pixel values corresponding to the texture coordinates of the plurality of vertices is the first pixel value (e.g., 0). In yet another embodiment, whether to cull the candidate three-dimensional element may also be determined based on the proportion of the first pixel value among the pixel values corresponding to the texture coordinates of the plurality of vertices. For example, a determination threshold (e.g., 50%) may be set in advance; assuming that the texture coordinates 1006 of the candidate three-dimensional element include the texture coordinates of its four vertices, and only the pixel value corresponding to 1 vertex is the first pixel value indicating that the pixel should be discarded, the first-pixel-value proportion is 1/4 = 25%, which is smaller than the preset threshold of 50%, so the candidate three-dimensional element may be determined not to be culled, even though it partially lies within the region to be culled. It should be understood that whether to cull the candidate three-dimensional element may also be determined according to any other suitable decision rule, which is not limited herein.
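A sketch of this proportion-based decision rule (the template lookup helper and the 50% default threshold follow the example above; the names are assumptions):

    declare function sampleTemplate(template: Uint8Array, width: number, height: number,
                                    s: number, t: number): number; // pixel value at (s, t)

    // Decide whether to cull an element whose vertices have been transformed to texture coordinates.
    function shouldCull(
      template: Uint8Array, width: number, height: number,
      vertexTexCoords: Array<[number, number]>,
      threshold = 0.5 // cull when at least this fraction of vertices lies on first-pixel-value texels
    ): boolean {
      let inside = 0;
      for (const [s, t] of vertexTexCoords) {
        if (sampleTemplate(template, width, height, s, t) === 0) inside++;
      }
      return inside / vertexTexCoords.length >= threshold;
    }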
Next, fig. 12 shows an exemplary process flow 1200 of a three-dimensional map processing method 700 according to an embodiment of the disclosure.
The process flow 1200 may be performed based on OpenGL or WebGL. As shown in fig. 12, the process flow 1200 may be divided into a pre-processing stage and a base map rendering stage. In the pre-processing stage, first, a frame buffer associated with the texture image 820 may be created in step S1201 and the current processing context bound to that frame buffer; then, in step S1202, the color buffer of the frame buffer may be cleared or initialized with a first value (e.g., rgba(0,0,0,0)). In step S1203, the polygon corresponding to the region to be culled may be rendered into the color buffer based on the data of the region to be culled (e.g., its boundary information); for example, the pixels corresponding to the region to be culled may be rendered with a second value (e.g., rgba(1,1,1,1)). After the rendering is completed, the frame buffer may be unbound in step S1204 for subsequent operations.
Next, in the base map rendering stage, first, in step S1205, the three-dimensional coordinates of the vertices of a specific candidate base map element may be calculated in a vertex shader of OpenGL or WebGL from the base map data (e.g., comprising a plurality of candidate base map elements). Next, in step S1206, model view projection transformations may be applied to the calculated three-dimensional coordinates of the vertices. In this step, two model view projection transformations are performed. The first preserves the z-axis offset of the vertex, and the vertex coordinates obtained through this transformation undergo subsequent rendering processing such as primitive assembly in step S1207. The other discards the z-axis offset of the vertex (i.e., sets z to 0); based on this transformation, the projection coordinates of the vertices of the candidate base map element in the projection coordinate system 1001 as shown in fig. 10 can be obtained. In step S1208, the projection coordinates may be passed into the fragment shader, and in step S1209, the texture coordinates of the projection coordinates in the texture coordinate system 1002 as shown in fig. 10 are calculated. Next, in step S1210, the template texture associated with the region to be culled (e.g., culling matching template 810) may be obtained from the color buffer generated in the pre-processing stage, and the template pixel values read from it. In step S1211, whether to discard the fragment may be determined based on the texture coordinates of the vertices of the candidate base map element and their corresponding template pixel values, similarly to step S705 shown in fig. 7. For example, in this embodiment, according to the rendering configuration of the pre-processing stage, if the corresponding template pixel value is rgba(1,1,1,1), the fragment may be considered to be located within the region to be culled and may be discarded in step S1212; if the corresponding template pixel value is rgba(0,0,0,0), the fragment may be considered to be outside the region to be culled and may be retained, and subsequent rendering processing such as fragment coloring may be performed in step S1213.
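For illustration, GLSL shaders corresponding to steps S1206-S1213 might look roughly as follows, embedded as strings in TypeScript; all attribute, uniform, and varying names are assumptions, not identifiers from the patent:

    // Vertex shader: one MVP transform for rendering, one z-dropped transform for the template lookup.
    const vertexShaderSource = `
      attribute vec3 a_position;
      uniform mat4 u_mvp;        // model view projection matrix (keeps the z offset)
      varying vec4 v_maskClip;   // projection coordinates of the z-dropped vertex (S1208)
      void main() {
        gl_Position = u_mvp * vec4(a_position, 1.0);          // transform for S1207
        v_maskClip = u_mvp * vec4(a_position.xy, 0.0, 1.0);   // z-dropped transform (S1206)
      }
    `;

    // Fragment shader: compute texture coordinates (S1209), read the template texture (S1210),
    // and discard fragments inside the region to be culled (S1211/S1212).
    const fragmentShaderSource = `
      precision mediump float;
      uniform sampler2D u_cullTemplate; // template texture from the pre-processing stage
      varying vec4 v_maskClip;
      void main() {
        vec2 uv = (v_maskClip.xy / v_maskClip.w) * 0.5 + 0.5; // equation (2): [-1,1] -> [0,1]
        if (texture2D(u_cullTemplate, uv).r > 0.5) {          // rgba(1,1,1,1) marks the region
          discard;                                            // step S1212: drop the fragment
        }
        gl_FragColor = vec4(0.5, 0.5, 0.5, 1.0);              // placeholder shading (S1213)
      }
    `;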
By the three-dimensional map processing method described above, the base map elements in the area where the model to be superimposed is located can be accurately culled in a model superimposition scene. It should be understood that the three-dimensional map processing method according to the embodiments of the present disclosure may be applied not only to the application scenario of displaying an indoor map in an overlaid manner, but also to other similar scenarios such as building model replacement.
Fig. 13 shows a schematic diagram of a three-dimensional map processing apparatus 1300 according to an embodiment of the present disclosure.
As shown in fig. 13, a three-dimensional map processing apparatus 1300 according to an embodiment of the present disclosure may include: an acquisition module 1301, a template generation module 1302, a projection determination module 1303, a coordinate transformation module 1304, and a template matching module 1305. The acquisition module 1301 may be configured to acquire boundary information of a region to be culled in the three-dimensional map; the template generation module 1302 may be configured to generate, in a first coordinate system, a culling matching template associated with the region to be culled based on the boundary information; the projection determination module 1303 may be configured to determine, in a second coordinate system, vertical projection coordinates of a candidate three-dimensional element in the three-dimensional map on a given plane, where the given plane is the plane corresponding to the region to be culled in the second coordinate system; the coordinate transformation module 1304 may be configured to determine first coordinates corresponding to the candidate three-dimensional element in the first coordinate system based on the vertical projection coordinates of the candidate three-dimensional element on the given plane; and the template matching module 1305 may be configured to determine, in the first coordinate system, whether to cull the candidate three-dimensional element based on the first coordinates corresponding to the vertical projection coordinates and the culling matching template associated with the region to be culled.
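How these five modules might chain together can be sketched as follows; every type and method name is a hypothetical stand-in for the modules 1301 to 1305, not an interface defined by this disclosure.

```typescript
// Illustrative composition of the modules of apparatus 1300; all names are
// hypothetical stand-ins, not identifiers from this disclosure.
interface Vec2 { x: number; y: number; }
interface Vec3 { x: number; y: number; z: number; }

interface MapProcessingApparatus {
  acquireBoundary(mapId: string): Vec2[];                       // module 1301
  generateCullingTemplate(boundary: Vec2[]): Uint8Array;        // module 1302
  projectOntoPlane(element: Vec3[]): Vec2[];                    // module 1303
  toFirstCoordinateSystem(projected: Vec2[]): Vec2[];           // module 1304
  matchTemplate(coords: Vec2[], template: Uint8Array): boolean; // module 1305
}

// The data flows exactly as the description above orders it: boundary ->
// template, element -> vertical projection -> first coordinates -> decision.
function shouldCullElement(
  apparatus: MapProcessingApparatus,
  mapId: string,
  element: Vec3[],
): boolean {
  const template = apparatus.generateCullingTemplate(
    apparatus.acquireBoundary(mapId),
  );
  const firstCoords = apparatus.toFirstCoordinateSystem(
    apparatus.projectOntoPlane(element),
  );
  return apparatus.matchTemplate(firstCoords, template);
}
```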
In one embodiment, determining, in the first coordinate system, whether to cull the candidate three-dimensional element based on the first coordinates corresponding to the vertical projection coordinates and the culling matching template associated with the region to be culled may include: acquiring the pixel values corresponding to the first coordinates from the culling matching template; and determining to cull the candidate three-dimensional element if at least some of the acquired pixel values are the first pixel value.
Fig. 14 shows a schematic diagram of a three-dimensional map processing device 1400 according to an embodiment of the present disclosure.
As shown in fig. 14, a three-dimensional map processing device 1400 according to an embodiment of the present disclosure may include a processor 1401 and a memory 1402, which may be interconnected through a bus 1403.
The processor 1401 may perform various actions and processes according to programs or code stored in the memory 1402. In particular, the processor 1401 may be an integrated circuit chip having signal processing capability. The processor may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components, and may implement or perform the various methods, steps, flows, and logic blocks disclosed in the embodiments of the disclosure. The general-purpose processor may be a microprocessor or any conventional processor, and may be of, for example, the x86 or ARM architecture.
The memory 1402 stores executable instructions that, when executed by the processor 1401, implement the three-dimensional map processing method according to the embodiments of the present disclosure. The memory 1402 may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be read-only memory (ROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. The volatile memory may be random access memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DR RAM). It should be noted that the memories of the systems and methods described herein are intended to comprise, without being limited to, these and any other suitable types of memory.
Embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon computer-executable instructions that, when executed by a processor, implement the three-dimensional map processing method according to the embodiments of the present disclosure. Similarly, the computer-readable storage medium in the embodiments of the disclosure may be volatile memory or nonvolatile memory, or may include both. It should be noted that the memories of the systems and methods described herein are intended to comprise, without being limited to, these and any other suitable types of memory.
Embodiments of the present disclosure also provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the three-dimensional map processing method according to the embodiment of the present disclosure.
The embodiments of the present disclosure provide a three-dimensional map processing method, apparatus, device, and storage medium. The three-dimensional map processing method provided by the embodiments of the present disclosure forms a binary template texture by rendering the polygon of the region to be culled into a frame buffer, then transforms the two-dimensionalized vertex coordinates of the base map elements to be rendered (such as buildings, points of interest (POIs), and the like) into the texture coordinate system, and determines from the template texture pixels whether to discard an element. In addition, the method is implemented entirely in the rendering stage, is decoupled from other modules, and can adapt to dynamic changes of the rendering target (or the region to be culled).
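Read alongside claims 7 to 10 below, the chain of transformations can be written compactly. The following is a sketch under the usual normalized-device-coordinate convention; the symbols $M$, $V$, $P$ and the $1/2$ offsets are illustrative and not notation used elsewhere in this disclosure:

$$
\mathbf{p}_{\mathrm{ndc}} \sim P\,V\,M\begin{pmatrix} x \\ y \\ 0 \\ 1 \end{pmatrix},
\qquad
\begin{pmatrix} u \\ v \end{pmatrix}
= \frac{1}{2}\begin{pmatrix} x_{\mathrm{ndc}} \\ y_{\mathrm{ndc}} \end{pmatrix}
+ \begin{pmatrix} 1/2 \\ 1/2 \end{pmatrix},
$$

where zeroing the third component realizes the vertical projection onto the given plane (the z-axis-offset-discarding transformation of step S1206), and the final affine map is the translation scaling from the projection coordinate system into the texture coordinate system.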
It is to be noted that the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises at least one executable instruction for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In general, the various example embodiments of this disclosure may be implemented in hardware or special purpose circuits, software, firmware, logic or any combination thereof. Certain aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device. While aspects of embodiments of the disclosure have been illustrated or described as block diagrams, flow charts, or using some other pictorial representation, it is well understood that the blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
The exemplary embodiments of the present disclosure described in detail above are merely illustrative, and not restrictive. It will be appreciated by those skilled in the art that various modifications and combinations of these embodiments or features thereof may be made without departing from the principles and spirit of the disclosure, and that such modifications are intended to be within the scope of the disclosure.

Claims (15)

1. A three-dimensional map processing method, comprising:
acquiring boundary information of a region to be culled in the three-dimensional map;
generating, in a first coordinate system, a culling matching template associated with the region to be culled based on the boundary information;
determining, in a second coordinate system, a vertical projection coordinate of a candidate three-dimensional element in the three-dimensional map on a given plane, wherein the given plane is a plane corresponding to the region to be culled in the second coordinate system;
determining a first coordinate corresponding to the candidate three-dimensional element in the first coordinate system based on the vertical projection coordinate of the candidate three-dimensional element on the given plane; and
determining, in the first coordinate system, whether to cull the candidate three-dimensional element based on the first coordinate corresponding to the vertical projection coordinate and the culling matching template associated with the region to be culled.
2. The processing method of claim 1, wherein generating, in the first coordinate system, the culling matching template associated with the region to be culled based on the boundary information comprises:
generating an initial matching template for the three-dimensional map in the first coordinate system, wherein the size of the initial matching template is determined based on the display size of the three-dimensional map, and the pixel resolution of the initial matching template is set according to a predetermined resolution; and
setting, in the initial matching template and based on the boundary information of the region to be culled, the pixels corresponding to the region to be culled to a first pixel value and the remaining pixels to a second pixel value different from the first pixel value, so as to generate the culling matching template.
3. The processing method of claim 1 or 2, wherein determining the vertical projection coordinate of the candidate three-dimensional element in the three-dimensional map on the given plane comprises:
acquiring coordinates of the candidate three-dimensional element in the second coordinate system as second coordinates;
determining the plane corresponding to the region to be culled in the second coordinate system as the given plane; and
determining the vertical projection coordinate of the candidate three-dimensional element on the given plane based on the second coordinates.
4. The processing method of claim 3, wherein acquiring coordinates of the candidate three-dimensional element in the second coordinate system as second coordinates comprises:
acquiring coordinates of vertices of the candidate three-dimensional element in the second coordinate system as the second coordinates.
5. The processing method of claim 3, wherein acquiring coordinates of the candidate three-dimensional element in the second coordinate system as second coordinates comprises:
acquiring coordinates of a centroid point of the candidate three-dimensional element in the second coordinate system as the second coordinates.
6. The processing method of claim 1, wherein determining the first coordinate corresponding to the candidate three-dimensional element in the first coordinate system comprises:
determining a coordinate transformation relationship between the first coordinate system and the second coordinate system; and
performing coordinate transformation on the vertical projection coordinate based on the coordinate transformation relationship to determine the first coordinate corresponding to the vertical projection coordinate of the candidate three-dimensional element in the first coordinate system.
7. The processing method of claim 6, wherein the first coordinate system is a texture coordinate system and the second coordinate system is a spatial coordinate system;
wherein determining a coordinate transformation relationship between the first coordinate system and the second coordinate system comprises:
determining a view projection relationship between the first coordinate system and a projection coordinate system;
determining a translation scaling relationship between the projection coordinate system and the second coordinate system; and
determining the coordinate transformation relationship between the first coordinate system and the second coordinate system based on the view projection relationship and the translation scaling relationship.
8. The processing method of claim 7, wherein determining the view projection relationship between the first coordinate system and the projection coordinate system comprises:
determining shooting parameters of a virtual camera, wherein the shooting parameters comprise a spatial position and a shooting angle of the virtual camera, and the shooting parameters of the virtual camera are associated with the projection coordinate system;
determining projection parameters of a projection plane, wherein the projection parameters comprise a projection window position of the projection plane, and the projection parameters of the projection plane are associated with the projection coordinate system; and
determining the view projection relationship between the first coordinate system and the projection coordinate system based on the shooting parameters of the virtual camera and the projection parameters of the projection plane.
9. The processing method of claim 7, wherein
the view projection relationship is a view projection matrix from the first coordinate system to the projection coordinate system, and the translation scaling relationship is a translation scaling matrix from the projection coordinate system to the second coordinate system;
wherein determining the first coordinate corresponding to the vertical projection coordinate of the candidate three-dimensional element in the first coordinate system comprises:
determining the first coordinate based on the vertical projection coordinate of the candidate three-dimensional element, the view projection matrix, and the translation scaling matrix.
10. The processing method of claim 9, wherein determining the first coordinate based on the vertical projection coordinate of the candidate three-dimensional element, the view projection matrix, and the translation scaling matrix comprises:
generating a third coordinate corresponding to the vertical projection coordinate in the projection coordinate system based on the vertical projection coordinate and the view projection matrix; and
determining the first coordinate corresponding to the vertical projection coordinate of the candidate three-dimensional element in the first coordinate system based on the third coordinate and the translation scaling matrix.
11. The processing method of claim 1, wherein determining, in the first coordinate system, whether to cull the candidate three-dimensional element based on the first coordinate corresponding to the vertical projection coordinate and the culling matching template associated with the region to be culled comprises:
acquiring a pixel value corresponding to the first coordinate from the culling matching template; and
determining to cull the candidate three-dimensional element if at least some of the acquired pixel values are the first pixel value.
12. A three-dimensional map processing apparatus, comprising:
an acquisition module configured to acquire boundary information of a region to be culled in a three-dimensional map;
a template generation module configured to generate, in a first coordinate system, a culling matching template associated with the region to be culled based on the boundary information;
a projection determination module configured to determine, in a second coordinate system, a vertical projection coordinate of a candidate three-dimensional element in the three-dimensional map on a given plane, wherein the given plane is a plane corresponding to the region to be culled in the second coordinate system;
a coordinate transformation module configured to determine a first coordinate corresponding to the candidate three-dimensional element in the first coordinate system based on the vertical projection coordinate of the candidate three-dimensional element on the given plane; and
a template matching module configured to determine, in the first coordinate system, whether to cull the candidate three-dimensional element based on the first coordinate corresponding to the vertical projection coordinate and the culling matching template associated with the region to be culled.
13. The processing apparatus of claim 12, wherein determining, in the first coordinate system, whether to cull the candidate three-dimensional element based on the first coordinate corresponding to the vertical projection coordinate and the culling matching template associated with the region to be culled comprises:
acquiring a pixel value corresponding to the first coordinate from the culling matching template; and
determining to cull the candidate three-dimensional element if at least some of the acquired pixel values are the first pixel value.
14. A three-dimensional map processing device, comprising:
a processor; and
a memory having stored thereon computer-executable instructions that, when executed by the processor, implement the method of any one of claims 1-11.
15. A computer-readable storage medium having stored thereon computer-executable instructions that, when executed by a processor, implement the method of any one of claims 1-11.
CN202010708793.8A 2020-07-22 2020-07-22 Three-dimensional map processing method, device, equipment and storage medium Active CN111815788B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010708793.8A CN111815788B (en) 2020-07-22 2020-07-22 Three-dimensional map processing method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111815788A true CN111815788A (en) 2020-10-23
CN111815788B CN111815788B (en) 2022-05-17

Family

ID=72862120

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010708793.8A Active CN111815788B (en) 2020-07-22 2020-07-22 Three-dimensional map processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111815788B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140267236A1 (en) * 2013-03-15 2014-09-18 Janne Kontkanen System and Method for Approximating Cartographic Projections by Linear Transformation
WO2017014838A1 (en) * 2015-07-21 2017-01-26 Qualcomm Incorporated Zero pixel culling for graphics processing
CN107851330A (en) * 2015-07-21 2018-03-27 高通股份有限公司 Zero pixel for graphics process is rejected
CN107564089A (en) * 2017-08-10 2018-01-09 腾讯科技(深圳)有限公司 Three dimensional image processing method, device, storage medium and computer equipment
CN110796742A (en) * 2019-10-25 2020-02-14 西安建筑科技大学 Three-dimensional scene cone eliminating method based on object-oriented

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YO-SEOP HWANG et al.: "Noise removal of LRF for 3D map building using the superposition median filter", 2012 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM) *
LI Pengfei et al.: "Design and Implementation of Data Processing and Display Technology for Three-Dimensional Models", Avionics Technology (《航空电子技术》) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112307553A (en) * 2020-12-03 2021-02-02 之江实验室 Method for extracting and simplifying three-dimensional road model
CN112307553B (en) * 2020-12-03 2024-04-16 之江实验室 Method for extracting and simplifying three-dimensional road model
CN114359456A (en) * 2021-12-27 2022-04-15 北京城市网邻信息技术有限公司 Picture pasting method and device, electronic equipment and readable storage medium
CN114359456B (en) * 2021-12-27 2023-03-24 北京城市网邻信息技术有限公司 Picture pasting method and device, electronic equipment and readable storage medium

Also Published As

Publication number Publication date
CN111815788B (en) 2022-05-17

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 40030153; Country of ref document: HK)
GR01 Patent grant