CN109544658B - Map rendering method and device, storage medium and electronic device - Google Patents

Map rendering method and device, storage medium and electronic device

Info

Publication number
CN109544658B
Authority
CN
China
Prior art keywords
vertex
coordinate point
rendering
point
area
Prior art date
Legal status
Active
Application number
CN201710860484.0A
Other languages
Chinese (zh)
Other versions
CN109544658A (en)
Inventor
李鸣
宋小建
陈明亮
肖旺裕
何朝阳
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201710860484.0A
Publication of CN109544658A
Application granted
Publication of CN109544658B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 - 2D [Two Dimensional] image generation
    • G06T 11/20 - Drawing from basic elements, e.g. lines or circles
    • G06T 11/206 - Drawing of charts or graphs
    • G06T 11/40 - Filling a planar surface by adding surface attributes, e.g. colour or texture

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a map rendering method and device, a storage medium and an electronic device. The method comprises the following steps: obtaining a rendering instruction, wherein the rendering instruction is used for at least indicating to render an inflection point between a first road section and a second road section in a map; searching a first coordinate point and a second coordinate point in the map in response to the rendering instruction, wherein a first edge where a first vertex is located on the first area intersects a second edge where a second vertex is located on the second area at the first coordinate point, and an extension line of a third edge where the first vertex is located on the first area intersects an extension line of a fourth edge where the second vertex is located on the second area at the second coordinate point; and rendering an arc line for connecting the first vertex and the second vertex according to the first coordinate point, the second coordinate point, the first vertex and the second vertex. The invention solves the technical problem in the related art that the boundary of the inflection point area of a road rendered on a map is inaccurate.

Description

Map rendering method and device, storage medium and electronic device
Technical Field
The invention relates to the field of electronic maps, in particular to a map rendering method and device, a storage medium and an electronic device.
Background
Symbolization of geospatial entities is an important research topic in the geographic information and cartography fields. Line symbols tend to be more difficult and time-consuming to draw than the commonly used point and area symbols, because they must be filled along the run of the line elements. The linear symbols required for map representation are more complex than plain solid and dashed lines. For example, the linear symbol of an urban main road is a double-line primitive with borders; the railway symbol is a black-and-white primitive with a frame; the single-sided boundary line symbol is a jagged primitive. Linear symbols combine different geometric shapes into primitives with semantic information. In the traditional drawing approach, a corresponding drawing function is designed for each linear symbol through symbol-specific functions; another common approach decomposes the map symbols by combined drawing and draws the vector lines multiple times according to the decomposed primitives.
In a map rendering scene, anti-aliased rendering of line elements can be implemented by texture mapping, where anti-aliasing is applied to the texture itself to achieve anti-aliasing of the line elements. At the joints between line segments (namely, roads), a fan shape is approximated as closely as possible by triangle fitting at different scale levels. Since the texture is itself a picture, the line elements may look somewhat unreal in rendered scenes with different scales (such as high-precision maps). Moreover, an end fitted with triangles is not a true circular arc, so the most realistic display cannot be achieved.
Fig. 1 shows a scheme that decomposes a map symbol by combined drawing and performs anti-aliased rendering by texture filling of triangular regions. Road 1 and road 2 intersect at point O (the inflection point), where an arc should visually be presented at the corner. In practice, however, because the GPU renders in units of triangles, the inflection point area is usually split into a plurality of triangles, such as the triangles OAB, OBC, OCD and ODE shown in fig. 1, to adapt to the characteristics of the GPU. The data of each triangle is then transmitted to the GPU and each triangular area is rendered, so the inflection point area finally presents a boundary composed of the line segments AB, BC, CD and DE that simulates a fan shape. In a high-precision map, however, a jagged boundary can still be seen when the user zooms in on the area, whereas the actual boundary should be the arc AE.
For the technical problem in the related art that the boundary of the inflection point area of a road rendered on a map is inaccurate, no effective solution has yet been proposed.
Disclosure of Invention
The embodiment of the invention provides a map rendering method and device, a storage medium and an electronic device, which are used for at least solving the technical problem that the boundary of an inflection point area of a road rendered on a map is inaccurate in the related technology.
According to an aspect of an embodiment of the present invention, there is provided a map rendering method, including: obtaining a rendering instruction, wherein the rendering instruction is used for at least indicating to render an inflection point between a first road section and a second road section in a map; in response to the rendering instruction, searching a first coordinate point and a second coordinate point in the map, wherein a first edge where a first vertex is located on a first area intersects a second edge where a second vertex is located on a second area at the first coordinate point, an extension line of a third edge where the first vertex is located on the first area intersects an extension line of a fourth edge where the second vertex is located on the second area at the second coordinate point, the first area is used for representing the first road section, and the second area is used for representing the second road section; and rendering an arc line for connecting the first vertex and the second vertex according to the first coordinate point, the second coordinate point, the first vertex and the second vertex.
According to an aspect of the embodiments of the present invention, there is also provided a map rendering apparatus, including: the obtaining unit is used for obtaining a rendering instruction, and the rendering instruction is used for at least indicating that an inflection point between a first road section and a second road section in a map is rendered; the response unit is used for responding to the rendering instruction, searching a first coordinate point and a second coordinate point in the map, wherein a first edge where a first vertex is located on the first area and a second edge where a second vertex is located on the second area intersect at the first coordinate point, an extension line of a third edge where the first vertex is located on the first area and an extension line of a fourth edge where the second vertex is located on the second area intersect at the second coordinate point, the first area is used for representing a first road section, and the second area is used for representing a second road section; and the rendering unit is used for rendering an arc line for connecting the first vertex and the second vertex according to the first coordinate point, the second coordinate point, the first vertex and the second vertex.
According to an aspect of the embodiments of the present invention, there is also provided a storage medium including a stored program which, when executed, performs the above-described method.
According to an aspect of the embodiments of the present invention, there is also provided an electronic apparatus, including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor executes the above method through the computer program.
In the embodiment of the invention, when a rendering instruction for indicating at least rendering of an inflection point between a first road section and a second road section in a map is obtained, a first coordinate point and a second coordinate point are searched in the map, a first edge where a first vertex is located on a first area intersects a second edge where a second vertex is located on a second area at the first coordinate point, an extension line of a third edge where the first vertex is located on the first area intersects an extension line of a fourth edge where the second vertex is located on the second area at the second coordinate point, the first area is used for representing the first road section, and the second area is used for representing the second road section; according to the first coordinate point, the second coordinate point, the first vertex and the second vertex, an arc line used for connecting the first vertex and the second vertex can be rendered, which solves the technical problem in the related art that the boundary of the inflection point area of a road rendered on a map is inaccurate and achieves the technical effect of accurately rendering the boundary of the inflection point area of the road on the map.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a schematic diagram of an alternative map rendering according to the related art;
fig. 2 is a schematic diagram of a hardware environment of a rendering method of a map according to an embodiment of the present invention;
FIG. 3 is a flow chart of an alternative map rendering method according to an embodiment of the present invention;
FIG. 4 is a schematic illustration of an alternative road region according to an embodiment of the invention;
FIG. 5 is a schematic illustration of an alternative road region according to an embodiment of the invention;
FIG. 6 is a schematic illustration of an alternative road region according to an embodiment of the invention;
FIG. 7 is a schematic illustration of an alternative road region according to an embodiment of the invention;
FIG. 8 is a schematic illustration of an alternative road region according to an embodiment of the invention;
FIG. 9 is a schematic diagram of alternative line element attribute values according to an embodiment of the present invention;
FIG. 10 is a schematic illustration of an alternative road region according to an embodiment of the invention;
FIG. 11 is a schematic diagram of an alternative map rendering apparatus according to an embodiment of the present invention; and
fig. 12 is a block diagram of a terminal according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
First, some of the terms appearing in the description of the embodiments of the present invention are explained as follows:
OpenGL (Open Graphics Library): a dedicated graphics program interface that defines a cross-language, cross-platform programming interface specification. It is used for three-dimensional graphics (two-dimensional graphics are also supported) and is a powerful, low-level graphics library that is convenient to call.
GL_LINES: each pair of vertices is treated as an independent line segment; vertices V(2n-1) and V(2n) define line segment n, so N vertices produce N/2 line segments. For example, with six vertices, straight lines are drawn between V1 and V2, between V3 and V4, and between V5 and V6, while no lines are drawn between V2 and V3 or between V4 and V5.
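For illustration only, the following is a minimal C++ sketch of how GL_LINES consumes vertex pairs; it assumes an OpenGL context with a bound VAO/VBO and a simple shader program already set up, and the vertex values are hypothetical:

```cpp
// Six 2D vertices -> GL_LINES draws three independent segments:
// V1-V2, V3-V4, V5-V6; V2-V3 and V4-V5 are NOT connected.
const GLfloat vertices[] = {
    0.0f, 0.0f,   0.2f, 0.0f,   // V1, V2 -> segment 1
    0.3f, 0.1f,   0.5f, 0.1f,   // V3, V4 -> segment 2
    0.6f, 0.2f,   0.8f, 0.2f    // V5, V6 -> segment 3
};
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 2 * sizeof(GLfloat), (void*)0);
glEnableVertexAttribArray(0);
glDrawArrays(GL_LINES, 0, 6);   // 6 vertices => 3 line segments
```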
According to an embodiment of the present invention, a method embodiment of a rendering method of a map is provided.
Alternatively, in the present embodiment, the rendering method of the map may be applied to a hardware environment formed by the server 202 and the terminal 204 as shown in fig. 2. As shown in fig. 2, the server 202 is connected to the terminal 204 through a network; the terminal 204 includes, but is not limited to, a PC, an in-vehicle terminal, or a mobile terminal (e.g., a mobile phone or a tablet computer). The map rendering method according to the embodiment of the present invention may be executed by the server 202, by the terminal 204, or by both the server 202 and the terminal 204. The terminal 204 may execute the rendering method of the map according to the embodiment of the present invention through a client installed on it.
For example, when the terminal 204 executes the method of the present application, the following steps may be included:
in step S21, the terminal receives basic map data (e.g., a data packet of a current area where the terminal is located, a data packet required after a current map is enlarged, etc.) sent by the map server.
In step S22, the terminal performs rendering processing.
Step S221, in order to render at least an inflection point between a first road segment and a second road segment in a map, searching a first coordinate point and a second coordinate point in the map, wherein a first edge where a first vertex is located on a first area intersects a second edge where a second vertex is located on a second area at the first coordinate point, an extension line of a third edge where the first vertex is located on the first area intersects an extension line of a fourth edge where the second vertex is located on the second area at the second coordinate point, the first area is used for representing the first road segment, and the second area is used for representing the second road segment;
step S222, rendering an arc line for connecting the first vertex and the second vertex according to the first coordinate point, the second coordinate point, the first vertex and the second vertex.
In step S23, the result of rendering (including the map of the area where the inflection point is located, the status of the inflection point, and the like) is displayed on the map client.
As for the rendering method of the map in the step S22, fig. 3 is a flowchart of an alternative rendering method of the map according to an embodiment of the present invention, and as shown in fig. 3, the method may include the following steps:
step S302, a rendering instruction is obtained, wherein the rendering instruction is used for at least indicating to render the inflection point between the first road section and the second road section in the map.
The rendering instruction described above may be a user-triggered, terminal-triggered, or server-triggered instruction. The rendering instruction can be used to indicate rendering of the inflection point between the first road segment and the second road segment in the map, and can also be used to indicate rendering of the parts of the currently displayed map area other than the inflection point.
Step S304, responding to the rendering instruction, searching a first coordinate point and a second coordinate point in the map, wherein a first edge where a first vertex on the first area is located and a second edge where a second vertex on the second area is located intersect at the first coordinate point, an extension line of a third edge where the first vertex on the first area is located and an extension line of a fourth edge where the second vertex on the second area is located intersect at the second coordinate point, the first area is used for representing the first road section, and the second area is used for representing the second road section.
Step S306, rendering an arc line for connecting the first vertex and the second vertex according to the first coordinate point, the second coordinate point, the first vertex and the second vertex.
Through the steps S302 to S306, when a rendering instruction for instructing to render at least an inflection point between a first road segment and a second road segment in a map is obtained, a first coordinate point and a second coordinate point are searched in the map, a first edge where a first vertex is located on the first region intersects with a second edge where a second vertex is located on the second region at the first coordinate point, an extension line of a third edge where the first vertex is located on the first region intersects with an extension line of a fourth edge where the second vertex is located on the second region at the second coordinate point, the first region is used for representing the first road segment, and the second region is used for representing the second road segment; according to the first coordinate point, the second coordinate point, the first vertex and the second vertex, an arc line used for connecting the first vertex and the second vertex can be rendered, the technical problem that the boundary of the inflection point area of a road rendered on a map in the related technology is inaccurate can be solved, and the technical effect of accurately rendering the boundary of the inflection point area of the road on the map is achieved.
In the above technical solution, during the stage in which the line elements are preprocessed into triangular patches, each vertex of the triangular patch is preprocessed, and an alpha operation is performed on the patch fragments at the line-segment joints during the actual GPU rendering process, so as to achieve anti-aliased rendering of the elements. Details are provided below in connection with the steps shown in fig. 3.
In the technical solution provided in step S302, a rendering instruction is obtained. The rendering instruction, which is used for at least indicating that an inflection point between a first road section and a second road section in a map is to be rendered, may be generated in the following ways, among others:
(1) a rendering instruction triggered by a user operation in the map client, such as opening the map, switching areas, or zooming in;
(2) a rendering instruction triggered by operations of other clients, such as launching the map or viewing a location;
(3) a rendering instruction generated and triggered by the map client when the client is started and the position of the terminal where the client is located changes;
(4) a rendering instruction triggered by an instruction sent to the map client by the map server.
In the technical solution provided in step S304, in response to the rendering instruction, a first coordinate point and a second coordinate point are searched in the map, where a first edge where the first vertex is located on the first region intersects with a second edge where the second vertex is located on the second region at the first coordinate point, an extension line of a third edge where the first vertex is located on the first region intersects with an extension line of a fourth edge where the second vertex is located on the second region at the second coordinate point, the first region is used for representing the first road segment, and the second region is used for representing the second road segment.
As shown in fig. 4, the first vertex D1 is one of two vertices at the end where the first region Q1 intersects the second region Q2, which is not overlapped with the second region; the second vertex D2 is one of the two vertices of the end where the second region Q2 intersects the first region Q1 that does not overlap the first region.
A first side B1 where the first vertex D1 is located on the first region Q1 intersects with a second side B2 where the second vertex D2 is located on the second region Q2 at a first coordinate point Z1, and an extension Y1 of a third side B3 where the first vertex is located on the first region Q1 intersects with an extension Y2 of a fourth side B4 where the second vertex is located on the second region Q2 at a second coordinate point Z2.
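For illustration, a small C++ helper (names are hypothetical, not taken from the patent) that intersects two infinite lines given in point-direction form; since an edge and its extension lie on the same infinite line, the same routine can yield Z1 from the first and second edges, and Z2 from the extensions of the third and fourth edges:

```cpp
#include <optional>

struct Vec2 { double x, y; };

// Intersect line P + t*u with line Q + s*v (infinite lines, so intersections
// that fall on an extension of an edge are covered as well).
std::optional<Vec2> intersectLines(Vec2 P, Vec2 u, Vec2 Q, Vec2 v) {
    double denom = u.x * v.y - u.y * v.x;   // 2D cross product of the directions
    if (denom == 0.0) return std::nullopt;  // parallel edges: no intersection
    double t = ((Q.x - P.x) * v.y - (Q.y - P.y) * v.x) / denom;
    return Vec2{P.x + t * u.x, P.y + t * u.y};
}
// Z1 = intersectLines(pointOnEdge1, dirEdge1, pointOnEdge2, dirEdge2)
// Z2 = intersectLines(pointOnEdge3, dirEdge3, pointOnEdge4, dirEdge4)
```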
In the technical solution provided in step S306, an arc line for connecting the first vertex and the second vertex is rendered according to the first coordinate point, the second coordinate point, the first vertex and the second vertex.
Alternatively, the "rendering an arc line for connecting the first vertex and the second vertex according to the first coordinate point, the second coordinate point, the first vertex and the second vertex" in step S306 may be implemented by the following sub-steps:
step S3062, determining an arc line connecting the first vertex and the second vertex in the third region, where the third region includes the first vertex, the first coordinate point, the second vertex, and the second coordinate point, which are connected in sequence, that is, a region formed by four vertices D1, Z2, D2, and Z1.
In the embodiment shown in step S3062, as shown in fig. 5:
a third coordinate point Z3 is searched in the map, at the intersection point of the first perpendicular bisector C1 of the first edge and the second perpendicular bisector C2 of the second edge; a fourth coordinate point O is then found on the target line segment B5, where the target line segment includes the third coordinate point Z3 and the second coordinate point Z2, and the distance between the fourth coordinate point and the first vertex is equal to the distance between the fourth coordinate point and the second vertex, that is, the fourth coordinate point is the center of the circle on which the arc lies.
The arc is then determined from the fourth coordinate point: taking the fourth coordinate point as the center and the distance between the fourth coordinate point and the first vertex as the radius, the arc is drawn clockwise with vertex D1 as the starting point and vertex D2 as the end point.
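Under the construction above, the fourth coordinate point is the point where the line through Z3 and Z2 meets the perpendicular bisector of segment D1D2, which is what makes it equidistant from both vertices. A hedged C++ sketch of this step, with hypothetical names:

```cpp
struct Vec2 { double x, y; };

static double dot(Vec2 a, Vec2 b) { return a.x * b.x + a.y * b.y; }

// Point on the line through Z3 and Z2 that is equidistant from D1 and D2,
// i.e. the intersection of that line with the perpendicular bisector of D1D2.
// Assumes the line Z3Z2 is not parallel to the bisector (dot(dir, n) != 0).
Vec2 arcCenter(Vec2 Z3, Vec2 Z2, Vec2 D1, Vec2 D2) {
    Vec2 dir{Z2.x - Z3.x, Z2.y - Z3.y};
    Vec2 mid{(D1.x + D2.x) / 2.0, (D1.y + D2.y) / 2.0};
    Vec2 n{D2.x - D1.x, D2.y - D1.y};   // bisector is { P : dot(P - mid, n) == 0 }
    double t = dot({mid.x - Z3.x, mid.y - Z3.y}, n) / dot(dir, n);
    return Vec2{Z3.x + t * dir.x, Z3.y + t * dir.y};   // radius = distance to D1
}
```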
Step S3064, when rendering the third area, rendering the area outside the arc line in the third area (i.e., the area outside the circle where the arc line is located) in a transparent manner, and rendering the area inside the arc line in the third area (i.e., the area inside the circle where the arc line is located) in a non-transparent manner.
Optionally, the step S306 of rendering an arc line for connecting the first vertex and the second vertex according to the first coordinate point, the second coordinate point, the first vertex and the second vertex may further include the steps of:
step S3066, when rendering the first pixel point in the third region, rendering the first pixel point according to the transparency corresponding to the first distance between the first pixel point and the fourth coordinate point, where the first pixel point is located in a region within an arc line in the third region, and the fourth coordinate point is a coordinate point where the center of the arc line is located.
In this embodiment, a first threshold value may be obtained first, the first threshold value being smaller than the radius r of the arc; and under the condition that the first distance between the first pixel point and the fourth coordinate point is smaller than a first threshold value m1, rendering the first pixel point according to a first transparency, wherein the first transparency is one. And under the condition that the first distance d1 between the first pixel point and the fourth coordinate point is not less than the first threshold value m1, rendering the first pixel point according to the second transparency f1, wherein the difference value between the first distance and the first threshold value is inversely proportional to the second transparency. The calculation formula of the second transparency is as follows:
f1=1-(d1-m1)/(r-m1)。
In order to adapt to GPU rendering in units of triangles, in an embodiment of step S306, when rendering the arc for connecting the first vertex and the second vertex according to the first coordinate point, the second coordinate point, the first vertex and the second vertex, the third area may be divided into a first triangular area and a second triangular area, where the first triangular area comprises the first vertex, the second coordinate point and the first coordinate point connected in sequence, and the second triangular area comprises the first coordinate point, the second coordinate point and the second vertex connected in sequence. The vertex information of the first triangular area (including D1, Z2 and Z1, so that the GPU can identify the first triangular area), the vertex information of the second triangular area (including Z1, Z2 and D2, so that the GPU can identify the second triangular area) and alpha channel information are then sent to the graphics processor, and the graphics processor renders the first triangular area and the second triangular area in units of triangular areas. The alpha channel information is used to indicate that the alpha channel value of a third pixel point is set to zero so as to render the arc, where the third pixel point is a pixel point in the area outside the arc within the first triangular area and the second triangular area.
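As a hedged sketch (standard OpenGL calls; the exact attribute layout, shader locations and the way the alpha channel information is consumed are assumptions rather than the patent's specified format), the two triangles could be packed and submitted as follows:

```cpp
// Each vertex carries its map coordinates plus an anti-aliasing attribute pair.
// Triangle 1: D1, Z2, Z1.  Triangle 2: Z1, Z2, D2.  Six vertices, GL_TRIANGLES.
struct KneeVertex { GLfloat x, y, ax, ay; };
KneeVertex knee[6] = { /* D1, Z2, Z1, Z1, Z2, D2 with their attribute values */ };

GLuint vbo;
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(knee), knee, GL_STATIC_DRAW);
glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, sizeof(KneeVertex), (void*)0);
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, sizeof(KneeVertex),
                      (void*)(2 * sizeof(GLfloat)));
glEnableVertexAttribArray(0);
glEnableVertexAttribArray(1);

glEnable(GL_BLEND);                                 // alpha = 0 pixels stay invisible
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDrawArrays(GL_TRIANGLES, 0, 6);   // fragment shader zeroes alpha outside the arc
```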
Optionally, when rendering an arc for connecting the first vertex and the second vertex according to the first coordinate point, the second coordinate point, the first vertex and the second vertex, the area of the road except the inflection point may be rendered as follows:
acquiring a second threshold value m2, wherein the second threshold value is smaller than the distance d2 (shown in fig. 6) between the third edge and the first perpendicular bisector of the first edge; and under the condition that a second distance d3 between a second pixel point in the first area and the first perpendicular bisector is larger than a second threshold value, rendering the second pixel point according to a third transparency f2, wherein the difference value between the second distance and the second threshold value is inversely proportional to the third transparency. The third transparency f2 is calculated as follows:
f2=1-(d3-m2)/(d2-m2)。
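Both fall-offs share the same linear-fade shape, so a single helper covers them; a minimal sketch with hypothetical names, used with d = d1, m = m1, full = r for f1, and d = d3, m = m2, full = d2 for f2:

```cpp
// Linear alpha fade used for the inflection fan (f1) and the road body (f2):
// alpha stays 1 up to the threshold m, then falls linearly to 0 at distance full.
double linearFade(double d, double m, double full) {
    if (d <= m)    return 1.0;              // fully opaque region
    if (d >= full) return 0.0;              // fully transparent region
    return 1.0 - (d - m) / (full - m);      // the f1 / f2 expressions in the text
}
```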
as an alternative embodiment, the following describes an embodiment of the present application in detail by taking an example of displaying a navigation map on a mobile terminal.
In a mobile phone map rendering scene, many line elements need to be drawn. OpenGL provides the "GL_LINES" option to support line drawing, but it does not handle the joints between line segments, and the line width is restricted: it does not support arbitrary shaping and its value cannot exceed 10 pixels. Therefore, the rendering built into OpenGL is not ideal for high-precision map rendering. In the present method, lines with large width values (for example, when the width value is larger than 10 pixels) are split, the split lines are drawn as triangular patches, and the corners at line segment joints are rounded, so as to achieve anti-aliased rendering of line elements.
With this technical solution, after the line width of a line element (namely, a road) is obtained, the line element can be split into a plurality of triangles, each triangle carrying geographic coordinates (namely, vertex coordinates) and anti-aliasing attribute values. The triangle vertex data, including the geographic coordinates and the anti-aliasing attributes (in the X-axis and Y-axis directions), is then organized and input to the first processing stage of the graphics rendering pipeline, the vertex shader; memory is created on the GPU for storing the vertex data (i.e., vertex information). After the vertex shader finishes processing, the triangles are rasterized into fragments, which are passed to the fragment shader for processing, where the anti-aliasing operation is performed. Finally, the GPU renders the result of the foregoing processing to the screen.
The process of rendering a single line element on a screen can be realized by the following steps:
In step S31, the line segment (actually, the area occupied by the line segment with its width) is split into triangles. Splitting a line element into triangles can be divided into splitting the main body of the line segment and splitting the line segment joints.
As shown in fig. 7, the line segment parallel to the X axis is denoted as line segment D4D3, the unit vector parallel to this segment and pointing in the X-axis direction is denoted as a, the line segment inclined to the X axis is denoted as line segment D7D8, and its unit vector pointing in the direction opposite to the X axis is denoted as b. If the current line width is W, the half width of the line segment is W/2. The vector Z1Z4 is then obtained by adding a × W/2 × (-1) and b × W/2.
Then, with the coordinates of each vertex known, the two line segments can be split into triangle 1 (vertices D5, D1 and D4), triangle 2 (vertices D4, D1 and Z4), triangle 3 (vertices Z4, D1 and Z1), triangle 4 (vertices Z4, Z1 and D2), triangle 5 (vertices D7, Z4 and D2) and triangle 6 (vertices D7, D2 and D6), which can then be rendered directly as triangle patches.
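A hedged C++ sketch of the body split (names hypothetical): each centre-line endpoint is offset by half the line width along the segment normal, giving the quad corners that are then emitted as two triangles.

```cpp
#include <cmath>

struct Vec2 { double x, y; };

// Expand a centre-line segment (p0, p1) of width W into the four corners of
// its quad; assumes p0 != p1. The quad is then split into two triangles.
void expandSegment(Vec2 p0, Vec2 p1, double W,
                   Vec2& left0, Vec2& right0, Vec2& left1, Vec2& right1) {
    double dx = p1.x - p0.x, dy = p1.y - p0.y;
    double len = std::sqrt(dx * dx + dy * dy);
    Vec2 n{-dy / len, dx / len};            // unit normal to the segment
    double h = W / 2.0;                     // half line width
    left0  = {p0.x + n.x * h, p0.y + n.y * h};
    right0 = {p0.x - n.x * h, p0.y - n.y * h};
    left1  = {p1.x + n.x * h, p1.y + n.y * h};
    right1 = {p1.x - n.x * h, p1.y - n.y * h};
}
// Triangles: (left0, right0, left1) and (right0, right1, left1).
```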
For the line segment joint, as shown in fig. 5, the coordinates of point Z2 can be calculated as the intersection of the extension lines of line segment D5D1 and line segment D6D2. Therefore, for the line segment joint, triangle 7 (vertices D1, Z2 and Z1) and triangle 8 (vertices Z1, Z2 and D2) need to be newly added.
It should be noted that, in the related art, at a line segment joint, depending on the corner angle and the line width at the current scale, a circular arc is often fitted with a plurality of triangles (as shown in fig. 1). This adds the overhead of many extra triangles, the joint is still not a true circular arc, and the transition is not natural.
By adopting the technical scheme of the application, the final effect is shown in fig. 8.
In step S32, the vertex is stored.
The vertices of the triangles 1 to 8 obtained above are stored in the vertex shader.
In step S33, the fragment is subjected to antialiasing processing.
After the above processing of the line element, the effect drawn on the screen is shown in fig. 9, where the vertical axis y represents the attribute value. Anti-aliasing processing is still required, and there is not yet any rounding at the line segment joints, so additional processing can be performed in the fragment shader.
After the vertex information is passed to the vertex shader, OpenGL automatically performs perspective-correct interpolation before the data reaches the fragment shader. As shown in fig. 10, a triangular patch is broken up into a plurality of rasterized fragments, each fragment having an interpolated result.
For the main body of a line segment (road), anti-aliasing is achieved by a gradual change of alpha transparency at the upper and lower parts of the line segment. For triangle 1, a threshold n2 is defined below point a, alpha (i.e., transparency) is gradually changed in the region between line D1D5 and line L1, and the remaining part keeps alpha equal to 1 (i.e., opaque). Since the alpha gradient is only performed in the Y-axis direction, the pre-stored anti-aliasing attributes of the triangle vertices are assigned as follows: the X-axis direction attributes are all 0, and the Y-axis direction attributes are set to -1 and 1 respectively, where 1 is the Y attribute value of the Z1 point and the D1 point. In this way, the triangle patch only performs alpha interpolation in the Y-axis direction.
For the line segment joint part, as shown in fig. 5, for triangle 7 (vertices D1, Z1 and Z2), the pre-stored anti-aliasing attributes of the triangle vertices are likewise assigned: the Z1 point is assigned (0, 0), the D1 point is assigned (0, 1), and the Z2 point is assigned (n, 1), where the value of n is related to the current corner angle.
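A hedged sketch of how these per-vertex assignments might be laid out in memory for triangle 7 (positions are placeholders; only the anti-aliasing attribute values follow the text, with n depending on the corner angle):

```cpp
struct KneeVertex { float x, y; float ax, ay; };  // position + AA attribute

const float n = 1.0f;   // placeholder; the text ties n to the current corner angle
KneeVertex triangle7[3] = {
    { /* D1 */ 0.0f, 0.0f,  0.0f, 1.0f },
    { /* Z2 */ 0.0f, 0.0f,  n,    1.0f },
    { /* Z1 */ 0.0f, 0.0f,  0.0f, 0.0f },
};
// After rasterization, each fragment receives an interpolated (ax, ay), whose
// distance to Z1's (0, 0) drives the arc test described next.
```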
Thus, during the rasterization performed before the fragment shader processing, all fragments whose distance to Z1 equals 1 form a circular arc. In the fragment shader, an anti-aliasing threshold m1 (between 0 and 1) is therefore also defined. For the anti-aliasing attribute value of each fragment, its distance d1 to the center O is calculated (if the two roads are exactly orthogonal, O may coincide with Z1); this computation can be performed during fragment rasterization.
If d1 < m1, the fragment's alpha is 1 and the fragment is opaque;
if m1 < d1 < 1, alpha is obtained by linear interpolation based on d1, ranging from 0 to 1, so the fragment gradually becomes transparent;
if d1 > 1, the fragment's alpha is 0 and the fragment is transparent.
In this way, the line segment joint finally appears in the form of a circular arc, as sketched below.
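Putting the three cases together, a minimal per-fragment sketch, written here as plain C++ mirroring what the fragment shader would compute (one plausible reading of the attribute scheme above; names hypothetical):

```cpp
#include <cmath>

// ax, ay: the fragment's interpolated anti-aliasing attribute; its length is the
// distance d1 to Z1/O in normalized units (the arc itself sits at d1 == 1).
// m1: anti-aliasing threshold in (0, 1).
float junctionAlpha(float ax, float ay, float m1) {
    float d1 = std::sqrt(ax * ax + ay * ay);
    if (d1 < m1)   return 1.0f;                 // opaque core of the joint
    if (d1 > 1.0f) return 0.0f;                 // outside the arc: transparent
    return 1.0f - (d1 - m1) / (1.0f - m1);      // linear fade across the arc edge
}
```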
For a single line in a map rendering scene, anti-aliased rendering can be achieved through the implementation described in this application, and line segment joints are handled well, meeting the requirements of map rendering scenes. With this line element rendering scheme, line elements rendered in a map scene look more realistic and the corners at line segment joints are more rounded.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
According to the embodiment of the invention, the map rendering device for implementing the map rendering method is also provided. Fig. 11 is a schematic diagram of an alternative map rendering apparatus according to an embodiment of the present invention, and as shown in fig. 11, the apparatus may include: an acquisition unit 1101, a response unit 1103, and a rendering unit 1105.
The obtaining unit 1101 is configured to obtain a rendering instruction, where the rendering instruction is configured to instruct to render at least an inflection point between a first road segment and a second road segment in a map.
The rendering instruction described above may be a user-triggered, terminal-triggered, or server-triggered instruction. The rendering instruction can be used to indicate rendering of the inflection point between the first road segment and the second road segment in the map, and can also be used to indicate rendering of the parts of the currently displayed map area other than the inflection point.
A response unit 1103, configured to, in response to the rendering instruction, search for a first coordinate point and a second coordinate point in the map, where a first edge where a first vertex is located on the first area intersects with a second edge where a second vertex is located on the second area at the first coordinate point, and an extension line of a third edge where the first vertex is located on the first area intersects with an extension line of a fourth edge where the second vertex is located on the second area at the second coordinate point, where the first area is used to represent the first road segment, and the second area is used to represent the second road segment.
A rendering unit 1105, configured to render an arc line connecting the first vertex and the second vertex according to the first coordinate point, the second coordinate point, the first vertex, and the second vertex.
It should be noted that the obtaining unit 1101 in this embodiment may be configured to execute step S302 in this embodiment, the responding unit 1103 in this embodiment may be configured to execute step S304 in this embodiment, and the rendering unit 1105 in this embodiment may be configured to execute step S306 in this embodiment.
It should be noted here that the modules described above are the same as the examples and application scenarios implemented by the corresponding steps, but are not limited to the disclosure of the above embodiments. It should be noted that the modules described above as a part of the apparatus may operate in a hardware environment as shown in fig. 2, and may be implemented by software or hardware.
Through the module, a rendering instruction for indicating at least rendering of an inflection point between a first road section and a second road section in a map is obtained, a first coordinate point and a second coordinate point are searched in the map, a first edge where a first vertex is located on a first area and a second edge where the second vertex is located on a second area intersect at the first coordinate point, an extension line of a third edge where the first vertex is located on the first area and an extension line of a fourth edge where the second vertex is located on the second area intersect at the second coordinate point, the first area is used for representing the first road section, and the second area is used for representing the second road section; according to the first coordinate point, the second coordinate point, the first vertex and the second vertex, an arc line used for connecting the first vertex and the second vertex can be rendered, the technical problem that the boundary of the inflection point area of a road rendered on a map in the related technology is inaccurate can be solved, and the technical effect of accurately rendering the boundary of the inflection point area of the road on the map is achieved.
In the above technical solution, during the stage in which the line elements are preprocessed into triangular patches, each vertex of the triangular patch is preprocessed, and an alpha operation is performed on the patch fragments at the line-segment joints during the actual GPU rendering process, so as to achieve anti-aliased rendering of the elements. The details are as follows.
The rendering unit may include: the determining module is used for determining an arc line which is used for connecting the first vertex and the second vertex in a third area, wherein the third area comprises the first vertex, the first coordinate point, the second vertex and the second coordinate point which are sequentially connected; and the first rendering module is used for rendering the region except the arc line in the third region in a transparent mode when rendering the third region, wherein the rendering is carried out in the transparent mode and is used for forming the arc line in the third region.
Optionally, the determining module may be further configured to: searching a third coordinate point where an intersection point between a first perpendicular bisector of the first edge and a second perpendicular bisector of the second edge is located in the map; searching a fourth coordinate point on the target line segment, wherein the target line segment comprises the third coordinate point and the second coordinate point, and the distance between the fourth coordinate point and the first vertex is equal to the distance between the fourth coordinate point and the second vertex; and determining an arc line according to the fourth coordinate point, wherein the arc line takes the fourth coordinate point as a circle center and takes the distance between the fourth coordinate point and the first vertex as a radius.
The rendering unit may further include: and the second rendering module is used for rendering the first pixel point according to the transparency corresponding to the first distance between the first pixel point and the fourth coordinate point when the first pixel point in the third region is rendered, wherein the first pixel point is located in the region within the arc line in the third region, and the fourth coordinate point is the coordinate point where the center of the arc line is located.
Optionally, the second rendering module is further configured to: acquiring a first threshold, wherein the first threshold is smaller than the radius of the arc; rendering the first pixel point according to a first transparency under the condition that a first distance between the first pixel point and the fourth coordinate point is smaller than a first threshold, wherein the first transparency is one; and under the condition that the first distance between the first pixel point and the fourth coordinate point is not less than a first threshold value, rendering the first pixel point according to a second transparency, wherein the difference value between the first distance and the first threshold value is inversely proportional to the second transparency.
Optionally, the rendering unit may further include: the dividing module is used for dividing the third area into a first triangular area and a second triangular area, wherein the first triangular area comprises a first vertex, a second coordinate point and a first coordinate point which are sequentially connected, and the second triangular area comprises a first coordinate point, a second coordinate point and a second vertex which are sequentially connected; and the sending module is used for sending the vertex information of the first triangular area, the vertex information of the second triangular area and alpha channel information to the graphics processor, and rendering the first triangular area and the second triangular area by taking the triangular areas as units through the graphics processor, wherein the alpha channel information is used for indicating that the alpha channel value of a third pixel point is set to be zero so as to render an arc line, and the third pixel point is a pixel point in an area except for the arc line in the first triangular area and the second triangular area.
Optionally, the rendering unit may further obtain a second threshold when rendering an arc line for connecting the first vertex and the second vertex according to the first coordinate point, the second coordinate point, the first vertex and the second vertex, where the second threshold is smaller than a distance between the third edge and the first perpendicular bisector of the first edge; and under the condition that a second distance between a second pixel point in the first area and the first perpendicular bisector is larger than a second threshold value, rendering the second pixel point according to a third transparency, wherein the difference value between the second distance and the second threshold value is inversely proportional to the third transparency.
For a single line, in a map rendering scene, anti-aliasing rendering can be realized through the implementation scheme of the application, and the connection part of the line segment can be well processed, so that the requirement of the map rendering scene is met. By adopting the line-marking element rendering scheme, in a map rendering scene, line-marking element rendering is more real, and corners at line segment joints are more rounded.
It should be noted here that the modules described above are the same as the examples and application scenarios implemented by the corresponding steps, but are not limited to the disclosure of the above embodiments. It should be noted that the modules described above as a part of the apparatus may be run in a hardware environment as shown in fig. 2, may be implemented by software, and may also be implemented by hardware, where the hardware environment includes a network environment.
According to an embodiment of the present invention, a server or a terminal (i.e., an electronic device) for implementing the rendering method of the map is also provided.
Fig. 12 is a block diagram of a terminal according to an embodiment of the present invention, and as shown in fig. 12, the terminal may include: one or more processors 1201 (only one is shown in fig. 12), a memory 1203, and a transmission means 1205 (such as the transmission means in the above embodiments), as shown in fig. 12, the terminal may further include an input-output device 1207.
The memory 1203 may be used to store software programs and modules, such as program instructions/modules corresponding to the map rendering method and apparatus in the embodiment of the present invention, and the processor 1201 executes various functional applications and data processing by running the software programs and modules stored in the memory 1203, that is, implements the map rendering method described above. The memory 1203 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 1203 may further include memory located remotely from the processor 1201, which may be connected to the terminal through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The above-mentioned transmission means 1205 is used for receiving or sending data via a network, and may also be used for data transmission between the processor and the memory. Examples of the network may include a wired network and a wireless network. In one example, the transmission device 1205 includes a Network adapter (NIC) that can be connected to a router via a Network cable and other Network devices to communicate with the internet or a local area Network. In one example, the transmission device 1205 is a Radio Frequency (RF) module, which is used to communicate with the internet in a wireless manner.
Among them, the memory 1203 is specifically used for storing an application program.
The processor 1201 may invoke an application stored in the memory 1203 via the transmission 1205 to perform the following steps:
obtaining a rendering instruction, wherein the rendering instruction is used for at least indicating that an inflection point between a first road segment and a second road segment in a map is rendered;
in response to the rendering instruction, searching a first coordinate point and a second coordinate point in the map, wherein a first edge where a first vertex is located on a first area intersects a second edge where a second vertex is located on a second area at the first coordinate point, an extension line of a third edge where the first vertex is located on the first area intersects an extension line of a fourth edge where the second vertex is located on the second area at the second coordinate point, the first area is used for representing a first road section, and the second area is used for representing a second road section;
and rendering an arc line for connecting the first vertex and the second vertex according to the first coordinate point, the second coordinate point, the first vertex and the second vertex.
The processor 1201 is further configured to perform the following steps:
dividing the third area into a first triangular area and a second triangular area, wherein the first triangular area comprises a first vertex, a second coordinate point and a first coordinate point which are sequentially connected, and the second triangular area comprises the first coordinate point, the second coordinate point and a second vertex which are sequentially connected;
sending the vertex information of the first triangular area, the vertex information of the second triangular area and alpha channel information to a graphics processor, and rendering the first triangular area and the second triangular area by taking the triangular areas as units through the graphics processor, wherein the alpha channel information is used for indicating that the alpha channel value of a third pixel point is set to be zero so as to render an arc line, and the third pixel point is a pixel point in an area outside the arc line in the first triangular area and the second triangular area.
By adopting the embodiment of the invention, when a rendering instruction for indicating at least rendering an inflection point between a first road section and a second road section in a map is obtained, a first coordinate point and a second coordinate point are searched in the map, a first edge where a first vertex is located on a first area intersects with a second edge where a second vertex is located on a second area at the first coordinate point, an extension line of a third edge where the first vertex is located on the first area intersects with an extension line of a fourth edge where the second vertex is located on the second area at the second coordinate point, the first area is used for representing the first road section, and the second area is used for representing the second road section; according to the first coordinate point, the second coordinate point, the first vertex and the second vertex, an arc line used for connecting the first vertex and the second vertex can be rendered, the technical problem that the boundary of the inflection point area of a road rendered on a map in the related technology is inaccurate can be solved, and the technical effect of accurately rendering the boundary of the inflection point area of the road on the map is achieved.
Optionally, the specific examples in this embodiment may refer to the examples described in the above embodiments, and this embodiment is not described herein again.
It can be understood by those skilled in the art that the structure shown in fig. 12 is only an illustration, and the terminal may be a terminal device such as a smart phone (e.g., an Android phone, an iOS phone, etc.), a tablet computer, a palm computer, and a Mobile Internet Device (MID), a PAD, etc. Fig. 12 is a diagram illustrating a structure of the electronic device. For example, the terminal may also include more or fewer components (e.g., network interfaces, display devices, etc.) than shown in FIG. 12, or have a different configuration than shown in FIG. 12.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing hardware associated with the terminal device, where the program may be stored in a computer-readable storage medium, and the storage medium may include: flash disks, Read-Only memories (ROMs), Random Access Memories (RAMs), magnetic or optical disks, and the like.
The embodiment of the invention also provides a storage medium. Alternatively, in the present embodiment, the storage medium may be a program code for executing a rendering method of a map.
Optionally, in this embodiment, the storage medium may be located on at least one of a plurality of network devices in a network shown in the above embodiment.
Optionally, in this embodiment, the storage medium is configured to store program code for performing the following steps:
S11, obtaining a rendering instruction, wherein the rendering instruction is used for at least indicating that an inflection point between a first road segment and a second road segment in a map is to be rendered;
S12, in response to the rendering instruction, searching for a first coordinate point and a second coordinate point in the map, wherein a first edge on which a first vertex of a first area is located intersects, at the first coordinate point, a second edge on which a second vertex of a second area is located, an extension line of a third edge on which the first vertex of the first area is located intersects, at the second coordinate point, an extension line of a fourth edge on which the second vertex of the second area is located, the first area is used for representing the first road segment, and the second area is used for representing the second road segment;
and S13, rendering an arc for connecting the first vertex and the second vertex according to the first coordinate point, the second coordinate point, the first vertex and the second vertex.
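Claim 1 below pins the arc of step S13 down geometrically: a third coordinate point Z3 is found where the perpendicular bisectors of the first and second sides intersect, and a center O is then located on the line segment through Z3 and the second coordinate point Z2 such that O is equidistant from the two vertices. The following Python sketch, offered only as a non-authoritative illustration, computes such a center and radius assuming D1, D2, Z2 and Z3 are already known; the closed-form linear solve and all names are assumptions of the sketch.

```python
import math
from typing import Tuple

Point = Tuple[float, float]

def arc_center_and_radius(d1: Point, d2: Point, z3: Point, z2: Point) -> Tuple[Point, float]:
    """Locate the arc center O on the line through Z3 and Z2 that is equidistant
    from the two vertices D1 and D2, and return (O, radius).

    Writing O = Z3 + t * (Z2 - Z3), the condition |O - D1|^2 == |O - D2|^2
    is linear in t, so it can be solved directly.
    """
    vx, vy = z2[0] - z3[0], z2[1] - z3[1]    # direction of the target line segment
    dx, dy = d1[0] - d2[0], d1[1] - d2[1]    # D1 - D2
    denom = 2.0 * (vx * dx + vy * dy)
    if abs(denom) < 1e-12:
        raise ValueError("target segment is parallel to the perpendicular bisector of D1D2; no unique center")
    num = (d1[0] ** 2 + d1[1] ** 2) - (d2[0] ** 2 + d2[1] ** 2) - 2.0 * (z3[0] * dx + z3[1] * dy)
    t = num / denom
    center = (z3[0] + t * vx, z3[1] + t * vy)
    radius = math.hypot(center[0] - d1[0], center[1] - d1[1])
    return center, radius

# Example with hypothetical coordinates:
center, radius = arc_center_and_radius(d1=(0.0, 0.0), d2=(0.0, 2.0), z3=(3.0, 0.0), z2=(1.0, 4.0))
# center == (2.5, 1.0); radius == math.sqrt(7.25); the center is equidistant from both vertices.
```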
Optionally, the storage medium is further arranged to store program code for performing the steps of:
S21, dividing the third area into a first triangular area and a second triangular area, wherein the first triangular area comprises a first vertex, a second coordinate point and a first coordinate point which are connected in sequence, and the second triangular area comprises the first coordinate point, the second coordinate point and a second vertex which are connected in sequence;
and S22, sending the vertex information of the first triangular area, the vertex information of the second triangular area and alpha channel information to a graphics processor, and rendering the first triangular area and the second triangular area in units of triangular areas through the graphics processor, wherein the alpha channel information indicates that the alpha channel value of a third pixel point is to be set to zero so as to render the arc, and the third pixel point is a pixel point located in the areas of the first triangular area and the second triangular area outside the arc.
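As a rough CPU-side illustration of steps S21 and S22 (a real renderer would evaluate the same per-pixel test in a fragment shader on the graphics processor), the sketch below splits the corner region into the two triangles and assigns every covered pixel an alpha of one inside the arc and zero outside it. The rasterization loop, the cross-product inside test and all names are assumptions of the sketch, not the patented GPU pipeline.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def _inside_triangle(p: Point, a: Point, b: Point, c: Point) -> bool:
    """Sign-of-cross-product test: True when p lies inside (or on) triangle abc."""
    def cross(o: Point, u: Point, v: Point) -> float:
        return (u[0] - o[0]) * (v[1] - o[1]) - (u[1] - o[1]) * (v[0] - o[0])
    s1, s2, s3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    has_neg = s1 < 0 or s2 < 0 or s3 < 0
    has_pos = s1 > 0 or s2 > 0 or s3 > 0
    return not (has_neg and has_pos)

def corner_alpha_mask(d1: Point, d2: Point, z1: Point, z2: Point,
                      center: Point, radius: float,
                      width: int, height: int) -> List[List[float]]:
    """Per-pixel alpha for the corner: 1 inside the arc, 0 elsewhere in the two triangles."""
    tri1 = (d1, z2, z1)    # first triangular area
    tri2 = (z1, z2, d2)    # second triangular area
    alpha = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            p = (x + 0.5, y + 0.5)    # sample at the pixel center
            if not (_inside_triangle(p, *tri1) or _inside_triangle(p, *tri2)):
                continue               # pixel is outside the corner region entirely
            dist = math.hypot(p[0] - center[0], p[1] - center[1])
            # Alpha channel rule of S22: zero outside the arc, one inside it.
            alpha[y][x] = 1.0 if dist <= radius else 0.0
    return alpha
```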
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the above embodiments; details are not repeated here.
Optionally, in this embodiment, the storage medium may include, but is not limited to, a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, and other media capable of storing program code.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
The integrated unit in the above embodiments, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in the above computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, etc.) to execute all or part of the steps of the method according to the embodiments of the present invention.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.

Claims (13)

1. A map rendering method, comprising:
obtaining a rendering instruction, wherein the rendering instruction is used for at least indicating that an inflection point between a first road segment and a second road segment in a map is rendered;
in response to the rendering instruction, searching for a first coordinate point Z1 and a second coordinate point Z2 in the map, wherein a first side B1 on which a first vertex D1 of a first region Q1 is located intersects, at the first coordinate point Z1, a second side B2 on which a second vertex D2 of a second region Q2 is located; an extension line of a third side on which the first vertex D1 of the first region Q1 is located intersects, at the second coordinate point Z2, an extension line of a fourth side on which the second vertex D2 of the second region Q2 is located; the first region Q1 is used for representing the first road segment, and the second region Q2 is used for representing the second road segment;
rendering an arc for connecting the first vertex D1 and the second vertex D2 according to the first coordinate point Z1, the second coordinate point Z2, the first vertex D1 and the second vertex D2;
wherein the arc is determined by:
searching the map for a third coordinate point Z3 at which a first perpendicular bisector of the first side B1 intersects a second perpendicular bisector of the second side B2;
finding a fourth coordinate point O on a target line segment, wherein the target line segment includes the third coordinate point Z3 and the second coordinate point Z2, and the distance between the fourth coordinate point O and the first vertex D1 is equal to the distance between the fourth coordinate point O and the second vertex D2;
determining the arc from the fourth coordinate point O, wherein the arc is centered at the fourth coordinate point O and has a radius equal to the distance between the fourth coordinate point O and the first vertex D1.
2. The method of claim 1, wherein rendering an arc connecting the first vertex D1 and the second vertex D2 according to the first coordinate point Z1, the second coordinate point Z2, the first vertex D1 and the second vertex D2 comprises:
determining the arc line connecting the first vertex D1 and the second vertex D2 in a third region, wherein the third region comprises the first vertex D1, the first coordinate point Z1, the second vertex D2 and the second coordinate point Z2 which are connected in sequence;
and when rendering the third region, rendering the part of the third region other than the arc in a transparent manner, wherein the rendering in the transparent manner is used for forming the arc in the third region.
3. The method according to any one of claims 1 to 2, wherein rendering an arc for connecting the first vertex D1 and the second vertex D2 according to the first coordinate point Z1, the second coordinate point Z2, the first vertex D1 and the second vertex D2 comprises:
when rendering a first pixel point in a third region, rendering the first pixel point according to a transparency corresponding to a first distance between the first pixel point and a fourth coordinate point O, wherein the first pixel point is located in the part of the third region that lies within the arc, and the fourth coordinate point O is the coordinate point at which the center of the arc is located.
4. The method of claim 3, wherein rendering the first pixel point according to the transparency corresponding to the first distance between the first pixel point and the fourth coordinate point O comprises:
obtaining a first threshold, wherein the first threshold is smaller than the radius of the arc;
rendering the first pixel point according to a first transparency under the condition that a first distance between the first pixel point and the fourth coordinate point O is smaller than the first threshold, wherein the first transparency is an alpha channel value of one;
rendering the first pixel point according to a second transparency under the condition that a first distance between the first pixel point and the fourth coordinate point O is not smaller than the first threshold, wherein a difference value between the first distance and the first threshold is inversely proportional to the second transparency.
5. The method of claim 1, wherein rendering an arc connecting the first vertex D1 and the second vertex D2 according to the first coordinate point Z1, the second coordinate point Z2, the first vertex D1 and the second vertex D2 comprises:
dividing a third area into a first triangular area and a second triangular area, wherein the first triangular area comprises the first vertex D1, the second coordinate point Z2 and the first coordinate point Z1 which are connected in sequence, and the second triangular area comprises the first coordinate point Z1, the second coordinate point Z2 and the second vertex D2 which are connected in sequence;
sending vertex information of the first triangular area, vertex information of the second triangular area and alpha channel information to a graphics processor, and rendering the first triangular area and the second triangular area in units of triangular areas through the graphics processor, wherein the alpha channel information is used for indicating that an alpha channel value of a third pixel point is set to zero so as to render the arc, and the third pixel point is a pixel point located in the areas of the first triangular area and the second triangular area other than the arc.
6. The method of claim 1, wherein when rendering an arc connecting the first vertex D1 and the second vertex D2 according to the first coordinate point Z1, the second coordinate point Z2, the first vertex D1 and the second vertex D2, the method further comprises:
obtaining a second threshold, wherein the second threshold is smaller than a distance between the third side and a first perpendicular bisector of the first side B1;
and under the condition that a second distance between a second pixel point in the first area Q1 and the first perpendicular bisector is greater than the second threshold, rendering the second pixel point according to a third transparency, wherein a difference value between the second distance and the second threshold is inversely proportional to the third transparency.
7. An apparatus for rendering a map, comprising:
the system comprises an acquisition unit, a processing unit and a display unit, wherein the acquisition unit is used for acquiring a rendering instruction, and the rendering instruction is used for at least indicating that an inflection point between a first road section and a second road section in a map is rendered;
a response unit, configured to search, in response to the rendering instruction, for a first coordinate point Z1 and a second coordinate point Z2 in the map, wherein a first side B1 on which a first vertex D1 of a first region Q1 is located intersects, at the first coordinate point Z1, a second side B2 on which a second vertex D2 of a second region Q2 is located; an extension line of a third side on which the first vertex D1 of the first region Q1 is located intersects, at the second coordinate point Z2, an extension line of a fourth side on which the second vertex D2 of the second region Q2 is located; the first region Q1 is used for representing the first road segment, and the second region Q2 is used for representing the second road segment;
a rendering unit, configured to render an arc line connecting the first vertex D1 and the second vertex D2 according to the first coordinate point Z1, the second coordinate point Z2, the first vertex D1, and the second vertex D2;
wherein the arc is determined by:
searching the map for a third coordinate point Z3 at which a first perpendicular bisector of the first side B1 intersects a second perpendicular bisector of the second side B2;
finding a fourth coordinate point O on a target line segment, wherein the target line segment includes the third coordinate point Z3 and the second coordinate point Z2, and the distance between the fourth coordinate point O and the first vertex D1 is equal to the distance between the fourth coordinate point O and the second vertex D2;
determining the arc from the fourth coordinate point O, wherein the arc is centered at the fourth coordinate point O and has a radius equal to the distance between the fourth coordinate point O and the first vertex D1.
8. The apparatus of claim 7, wherein the rendering unit comprises:
a determining module, configured to determine, in a third region, the arc for connecting the first vertex D1 and the second vertex D2, wherein the third region includes the first vertex D1, the first coordinate point Z1, the second vertex D2, and the second coordinate point Z2, which are connected in sequence;
and a first rendering module, configured to render, in a transparent manner, a region other than the arc line in the third region when rendering the third region, where the rendering in the transparent manner is used to form the arc line in the third region.
9. The apparatus according to any one of claims 7 to 8, wherein the rendering unit comprises:
and the second rendering module is used for rendering, when a first pixel point in a third region is rendered, the first pixel point according to a transparency corresponding to a first distance between the first pixel point and a fourth coordinate point O, wherein the first pixel point is located in the part of the third region that lies within the arc, and the fourth coordinate point O is the coordinate point at which the center of the arc is located.
10. The apparatus of claim 9, wherein the second rendering module is further configured to:
obtaining a first threshold, wherein the first threshold is smaller than the radius of the arc;
rendering the first pixel point according to a first transparency under the condition that a first distance between the first pixel point and the fourth coordinate point O is smaller than the first threshold, wherein the first transparency is an alpha channel value of one;
rendering the first pixel point according to a second transparency under the condition that a first distance between the first pixel point and the fourth coordinate point O is not smaller than the first threshold, wherein a difference value between the first distance and the first threshold is inversely proportional to the second transparency.
11. The apparatus of claim 7, wherein the rendering unit comprises:
a dividing module, configured to divide a third area into a first triangular area and a second triangular area, where the first triangular area includes the first vertex D1, the second coordinate point Z2, and the first coordinate point Z1 that are connected in sequence, and the second triangular area includes the first coordinate point Z1, the second coordinate point Z2, and the second vertex D2 that are connected in sequence;
and the sending module is used for sending the vertex information of the first triangular area, the vertex information of the second triangular area and alpha channel information to a graphics processor, and for rendering the first triangular area and the second triangular area in units of triangular areas through the graphics processor, wherein the alpha channel information is used for indicating that an alpha channel value of a third pixel point is set to zero so as to render the arc, and the third pixel point is a pixel point located in the areas of the first triangular area and the second triangular area other than the arc.
12. A storage medium, characterized in that the storage medium comprises a stored program, wherein the program when executed performs the method of any of the preceding claims 1 to 6.
13. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor executes the method of any of the preceding claims 1 to 6 by means of the computer program.
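For readers of claims 4 and 6 above: they describe an anti-aliasing refinement in which a pixel stays fully opaque while its distance is below a threshold slightly smaller than the arc radius (or, for the road edge, smaller than the distance to the perpendicular bisector), and its alpha value then decreases as the excess distance grows. The helper below is a minimal sketch of one such falloff; the linear ramp down to zero at the radius is an assumption, since the claims only require that the alpha drop as the difference between the distance and the threshold increases.

```python
def falloff_alpha(distance: float, threshold: float, radius: float) -> float:
    """Alpha for a pixel at `distance` from the arc center.

    Fully opaque while distance < threshold; ramps linearly down to zero at
    `radius`. The linear ramp is an illustrative choice only.
    """
    if distance < threshold:
        return 1.0
    if distance >= radius:
        return 0.0
    return 1.0 - (distance - threshold) / (radius - threshold)
```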
CN201710860484.0A 2017-09-21 2017-09-21 Map rendering method and device, storage medium and electronic device Active CN109544658B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710860484.0A CN109544658B (en) 2017-09-21 2017-09-21 Map rendering method and device, storage medium and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710860484.0A CN109544658B (en) 2017-09-21 2017-09-21 Map rendering method and device, storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN109544658A (en) 2019-03-29
CN109544658B (en) 2022-03-25

Family

ID=65827998

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710860484.0A Active CN109544658B (en) 2017-09-21 2017-09-21 Map rendering method and device, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN109544658B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111197993B (en) * 2019-12-26 2021-11-23 广州文远知行科技有限公司 Map rendering method and device, computer equipment and storage medium
CN111121794B (en) * 2019-12-31 2022-01-04 广州文远知行科技有限公司 Map rendering method and device, terminal equipment and storage medium
CN113568992A (en) * 2020-04-28 2021-10-29 阿里巴巴集团控股有限公司 Rendering method and device for intersection area and electronic equipment
CN111797192B (en) * 2020-07-27 2023-09-01 平安科技(深圳)有限公司 GIS point data rendering method and device, computer equipment and storage medium
CN112597258B (en) * 2020-11-20 2022-03-04 上海莉莉丝网络科技有限公司 Splitting method and splitting system for dynamic area boundary in game map and computer readable storage medium
CN113532450B (en) * 2021-06-29 2024-04-30 广州小鹏汽车科技有限公司 Virtual parking map data processing method and system

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5777618A (en) * 1993-07-29 1998-07-07 Digital Equipment Corporation Method and apparatus for graphical panning
DE19750171B4 (en) * 1996-11-12 2006-12-14 Honda Giken Kogyo K.K. Vehicle control system
CN101415079A (en) * 2008-11-04 2009-04-22 新奥特(北京)视频技术有限公司 Method for transforming subtitling object into Bessel curve
CN102436678A (en) * 2010-09-29 2012-05-02 比亚迪股份有限公司 Method and system for generating three-dimensional road model
CN102157001A (en) * 2011-04-14 2011-08-17 中国测绘科学研究院 Method and system for drawing electronic map
CN104599346A (en) * 2013-12-11 2015-05-06 腾讯科技(深圳)有限公司 Driving behavior evaluation method and driving behavior evaluation apparatus
CN106687891A (en) * 2014-09-15 2017-05-17 微软技术许可有限责任公司 Smooothing and GPU-enabled rendering of digital ink
CN104536743A (en) * 2014-12-19 2015-04-22 中国电子科技集团公司第十五研究所 Map plotting method and system based on Android operating system
CN106843239A (en) * 2017-04-11 2017-06-13 珠海市微半导体有限公司 Motion planning and robot control method based on map prediction

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Selective Anti-Aliasing for Virtual Reality Based on Saliency Map; Mankyu Sung et al.; 2017 International Symposium on Ubiquitous Virtual Reality (ISUVR); 2017-07-24; pp. 16-19 *
Research on GIS Vector Boundary Updating Method Based on Prior Knowledge; Sun Yang; China Master's Theses Full-text Database, Basic Sciences; 2008-05-15 (No. 05); Section 2, pp. 8-19 *

Also Published As

Publication number Publication date
CN109544658A (en) 2019-03-29

Similar Documents

Publication Publication Date Title
CN109544658B (en) Map rendering method and device, storage medium and electronic device
CN107358649B (en) Processing method and device of terrain file
CN110211218B (en) Picture rendering method and device, storage medium and electronic device
US9275493B2 (en) Rendering vector maps in a geographic information system
US9495767B2 (en) Indexed uniform styles for stroke rendering
CN104268911B (en) The method and apparatus of route in map making
US8237745B1 (en) Label positioning technique to reduce crawling during zoom activities
JP5959637B2 (en) Rendering a text image that follows a line
CN108765520B (en) Text information rendering method and device, storage medium and electronic device
CN105574931A (en) Electronic map road drawing method and device
US10134171B2 (en) Graphics processing systems
US11087511B1 (en) Automated vectorization of a raster image using a gradient mesh with arbitrary topology
CN111127590B (en) Second-order Bezier curve drawing method and device
JP2016520920A (en) 2D curve tessellation using graphics pipeline
KR20050030569A (en) Image processing apparatus and method thereof
US20160307294A1 (en) Systems and Methods for Displaying Patterns of Recurring Graphics on Digital Maps
CN107038729B (en) Digital instrument panel drawing method based on OpenGL-ES
EP4231243A1 (en) Data storage management method, object rendering method, and device
US9779528B2 (en) Text realization
CN109427084B (en) Map display method, device, terminal and storage medium
CN111431953A (en) Data processing method, terminal, server and storage medium
US11989807B2 (en) Rendering scalable raster content
US20220414986A1 (en) Segmenting three-dimensional meshes in graphical applications based on detection of elongated shapes
CN113786616A (en) Indirect illumination implementation method and device, storage medium and computing equipment
US10339704B2 (en) Image data processing method in image processor and computer readable medium storing program therefor for rendering a texture based on a triangulation pattern

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant