CN113763522A - Map rendering method, device, equipment and medium


Info

Publication number
CN113763522A
Authority
CN
China
Prior art keywords: line, area, point, edge, sideline
Prior art date
Legal status
Pending
Application number
CN202111104795.7A
Other languages
Chinese (zh)
Inventor
崔盼盼
冯磊
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202111104795.7A
Publication of CN113763522A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/005 General purpose rendering architectures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05 Geographic models

Abstract

The map rendering method provided in the embodiments of this application obtains, from map data, the lane contour lines used to construct a diversion area, determines the extent of the diversion area so as to identify the spatial region in which diversion lines need to be rendered, and then performs image rendering within that extent, so that the finally displayed map data contain diversion lines consistent with the actual road. The realism of the map data and the user experience are thereby effectively improved without increasing the data volume or the difficulty of map collection. Application fields of this scheme include, but are not limited to, maps, navigation, intelligent transportation, smart mobility, autonomous driving, and the like.

Description

Map rendering method, device, equipment and medium
Technical Field
The present disclosure relates generally to the field of data processing technologies, and more particularly, to a map rendering method, apparatus, device, and medium.
Background
To clearly express the detailed geometric data of every diversion line within a diversion area on a map, a large amount of manpower and material resources must be spent on collection and maintenance when producing a high-precision map. In particular, data acquisition imposes strict requirements on data quality; if errors in the collected data exceed the requirement, gaps appear between the interior diversion lines and the edge lines of the diversion area, which degrades the product experience.
Disclosure of Invention
In view of the above defects or shortcomings in the prior art, it is desirable to provide a map rendering method, apparatus, device, and medium that can automatically generate diversion lines from the edge lines of a diversion area, effectively reducing the map update cost while preserving the accuracy of the diversion lines.
In a first aspect, an embodiment of the present application provides a map rendering method, including:
acquiring, from map data, lane contour lines for constructing a diversion area, where the lane contour lines comprise the two adjacent edge lines of two adjacent, non-parallel lanes;
determining the extent of the diversion area from the two edge lines;
and performing image rendering within the extent of the diversion area to obtain map data containing diversion lines.
In a second aspect, an embodiment of the present application provides a map rendering apparatus, including:
an acquisition module, configured to acquire, from map data, lane contour lines for constructing a diversion area, where the lane contour lines comprise the two adjacent edge lines of two adjacent, non-parallel lanes;
a determining module, configured to determine the extent of the diversion area from the two edge lines;
and a rendering module, configured to perform image rendering within the extent of the diversion area to obtain map data containing diversion lines.
In some embodiments, the two edge lines are an original first edge line and an original second edge line, and the determining module is configured to:
acquire the head point and the tail point of each of the original first edge line and the original second edge line;
align the lengths of the original first edge line and the original second edge line according to their head and tail points to obtain a first edge line and a second edge line;
and take the area enclosed by the first edge line and the second edge line as the extent of the diversion area.
In some embodiments, the determining module is configured to:
for each original edge line, vertically project its head point and tail point onto the other edge line to obtain a first projection point corresponding to the head point and a second projection point corresponding to the tail point;
and take the line segments cut off by the two pairs of first and second projection points as the first edge line and the second edge line, respectively.
In some embodiments, the extent of the diversion area is bounded by the first edge line and the second edge line, and the rendering module is configured to:
construct a dividing line for the diversion area, where the dividing line determines the turning positions of the diversion lines;
determine the anchor points of a plurality of diversion lines on the dividing line;
construct the diversion lines in the diversion area according to the anchor points;
and perform width rendering on the diversion lines to obtain the rendered map data.
In some embodiments, the two edge lines include a first edge line and a second edge line, and the rendering module is configured to:
construct the center line of the included angle between the first edge line and the second edge line;
obtain the line connecting the tail point of the first edge line and the tail point of the second edge line, and take the included-angle center line cut off by that connecting line as an initial center line;
and correct the initial center line to obtain the dividing line of the diversion area.
In some embodiments, the rendering module is configured to:
select a plurality of second sampling points on the initial center line;
for each second sampling point, project it perpendicularly onto the first edge line and the second edge line to obtain a third projection point on the first edge line and a fourth projection point on the second edge line;
and connect the midpoints of the lines joining the third and fourth projection points corresponding to the plurality of second sampling points to obtain the dividing line of the diversion area.
In some embodiments, the rendering module is configured to:
determine a plurality of first sampling points on the dividing line according to a preset rule, and take each first sampling point as an anchor point of a diversion line.
In some embodiments, each anchor point is a first sampling point on the dividing line, and the rendering module is configured to:
for each first sampling point, construct a first normal vector and a second normal vector at that point, where the first normal vector lies on the same side of the target center line as the first edge line, and the second normal vector lies on the other side, the same side as the second edge line;
acquire a preset rotation angle;
rotate the first normal vector by the rotation angle towards the head point of the first edge line to obtain a first intersection point with the first edge line, and rotate the second normal vector by the rotation angle towards the head point of the second edge line to obtain a second intersection point with the second edge line;
and connect the first sampling point to the first intersection point and to the second intersection point to obtain a diversion line.
In some embodiments, the preset rotation angle is the included angle formed by the first edge line and the second edge line, and the rendering module is configured to:
obtain the included angle formed by the first edge line and the second edge line;
rotate the first normal vector by the included angle towards the head point of the first edge line to obtain a first intersection point with the first edge line, and rotate the second normal vector by the included angle towards the head point of the second edge line to obtain a second intersection point with the second edge line;
and connect the first sampling point to the first intersection point and to the second intersection point to obtain a diversion line.
In some embodiments, the intersections of a diversion line with the two edge lines are a first intersection point and a second intersection point, respectively, and the rendering module is configured to:
acquire a preset width corresponding to the diversion line;
determine the width region of the diversion line centered on the first sampling point, the first intersection point, and the second intersection point, respectively;
and divide the width region into sub-regions and fill-render them to obtain a rendered diversion line.
In some embodiments, the rendering module performs width rendering on the diversion line by triangle rendering, and is configured to:
divide the width region into at least one group of triangular regions;
and render each triangular region stereoscopically to obtain a rendered diversion line.
In a third aspect, embodiments of the present application provide an electronic device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and the processor executes the computer program to implement the method described in the embodiments of the present application.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the method as described in the embodiments of the present application.
According to the map rendering method provided in the embodiments of this application, the lane contour lines for constructing a diversion area are obtained from map data and the extent of the diversion area is determined, thereby identifying the spatial region in which diversion lines need to be rendered; image rendering is then performed within that extent, so that the finally displayed map data contain diversion lines consistent with the actual road. The realism of the map data and the user experience are thus effectively improved without increasing the data volume or the difficulty of map collection.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is a diagram illustrating the effect of a conventional map in the prior art;
FIG. 2 is a schematic view of an application scene of a map rendering method according to an embodiment of the present application;
FIG. 3 is a schematic view of the effect of a diversion area obtained with the map rendering method provided in an embodiment of the present application;
FIG. 4 is a flowchart of a map rendering method according to an embodiment of the present application;
FIG. 5 is a flowchart of a map rendering method according to another embodiment of the present application;
FIG. 6 is a diagram illustrating edge line data of a diversion area according to an embodiment of the present disclosure;
FIG. 7 is a schematic view of a first edge line and a second edge line in an embodiment of the present application;
FIG. 8 is a flowchart of a map rendering method according to another embodiment of the present application;
FIG. 9 is a schematic view of an initial center line according to an embodiment of the present application;
FIG. 10 is a flowchart of a map rendering method according to still another embodiment of the present application;
FIG. 11 is a schematic diagram of the initial center line correction principle in an embodiment of the present application;
FIG. 12 is a flowchart of a map rendering method according to still another embodiment of the present application;
FIG. 13 is a schematic diagram of controlling the rotation of a normal vector in an embodiment of the present application;
FIG. 14 is a flowchart of a map rendering method according to still another embodiment of the present application;
FIG. 15 is a schematic diagram illustrating the effect of determining a width region according to an embodiment of the present application;
FIG. 16 is a diagram illustrating the effect of triangulation in an embodiment of the present application;
FIG. 17 is a flowchart of a map rendering method according to still another embodiment of the present application;
FIG. 18 is a schematic diagram illustrating a map rendering method according to yet another embodiment of the present application;
FIG. 19 is a block diagram illustrating a map rendering apparatus according to still another embodiment of the present application;
FIG. 20 is a schematic structural diagram of a computer system suitable for implementing the electronic device or the server according to the embodiments of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
A traditional navigation map depicts roads too simply: as shown in FIG. 1, only the outline of the road is drawn, many elements and scenes on the road are not described, and the degree to which the real world is reproduced is low, so using such a map involves a certain usage cost and gives a poor experience.
Compared with the road data of a traditional map, high-precision map data are collected far more precisely: they cover the individual lanes of a road, the various markings on the road surface, speed-limit information and the like, as well as the diversion lines that prompt the user to drive along the designated route. A high-precision map can therefore contain geometric data describing a diversion area in detail, down to the geometry of every diversion line within it, and the collected data are used to model and depict the diversion area.
However, because high-precision map data include the detailed geometric data of every diversion line in a diversion area, a large amount of manpower and material resources must be spent on data collection and maintenance, the volume of map data grows substantially, and the storage cost for the end user rises accordingly. In addition, to meet the drawing requirements of a high-precision map, the requirements on data collection become stricter, and gaps easily appear between the diversion lines and the edge lines of the diversion area, which seriously harms the experience of the map product.
Rendering is an image processing technique that starts from an original image or model and adds content such as color, lighting, and shadows, so that the image finally displayed on the screen looks realistic.
On this basis, a map rendering method, apparatus, device, and medium are provided that can render diversion lines from the lane contour lines, so that the map still offers lane-level depiction while the difficulty of map data collection and the cost of map drawing are effectively reduced.
The embodiment of the application provides a map rendering method, a map rendering device, map rendering equipment and a storage medium.
The map rendering method provided in the embodiments of this application is mainly executed by a map rendering apparatus. The apparatus can be implemented in software and/or hardware, and can be configured in an electronic device or in a server that controls the electronic device, the server communicating with the electronic device in order to control it.
The electronic device can be a terminal, such as a mobile phone, a tablet computer, a notebook computer, a vehicle-mounted mobile terminal, an intelligent wearable device, and the like.
The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a network server, cloud communication, middleware service, a domain name server, a security service, a CDN, and a big data and artificial intelligence platform, but is not limited thereto.
The map rendering method provided in the embodiments of this application can be used by the server to render the map to be published from the map data supplied by a map collection vehicle and then publish it to the terminals communicating with the server, so that users can navigate with the published map; it can also be used by the terminal to render the diversion areas of the map locally after receiving the map published by the server, which reduces the amount of data downloaded from the server.
Taking as an example the case in which the server executes the map rendering method provided in the embodiments of this application, as shown in FIG. 2, the map processing system may include a terminal 1 and a server 2. The server 2 may be the server that provides the rendered map to the end user.
A diversion area is the V-shaped area formed between a main-line lane and an entrance or exit ramp at the entrances and exits of roads such as expressways and national or provincial trunk roads. A diversion area is generally marked with zebra stripes or V-shaped lines, each of which is a white line of a certain width.
The server 2 receives the data used to draw the map and draws the map model shown in FIG. 1. In that model, the geographic position, direction, and width of every lane in the map are fully recorded through the lane contour lines, which provides accurate navigation information to the user. However, such map information differs from the real road: for example, if no diversion lines are drawn at the position of a diversion area, the user can easily misread the lane contour lines, that is, mistake the area formed between the adjacent contour lines of two diverging lanes for a drivable lane.
To address this problem, the server 2 uses the lane contour lines that are liable to be misread to determine the extent of the diversion area in the map, and then renders diversion lines within that extent to obtain map data containing diversion lines. The rendered map data thus match the actual road and look more realistic. The server 2 then publishes the rendered map, and the terminal 1 communicating with the server 2 receives it and uses it for navigation and other operations; the map published by the server 2 is shown in FIG. 3. In this way, the realism of the displayed map is effectively improved without increasing the difficulty of map data collection, the probability that users misread the map is reduced, and driving safety is improved.
FIG. 4 is a flowchart of a map rendering method according to an embodiment of the present application. As shown in FIG. 4, the map rendering method according to an embodiment of the present application includes the following steps:
Step 101: obtain, from map data, lane contour lines for constructing a diversion area, where the lane contour lines comprise the two adjacent edge lines of two adjacent, non-parallel lanes.
In one or more embodiments, the map data may first be analyzed to find the map areas containing an intersection or a ramp entrance or exit; those areas are then analyzed further to identify the lane contour lines at the intersection or entrance, and the two adjacent edge lines of two adjacent lanes are taken as the lane contour lines for constructing the diversion area.
In the embodiments of this application, the map data may be a traditional navigation map or high-precision map data; when high-precision map data are used, no particular precision is required of the acquisition device when capturing the diversion lines.
Step 102: determine the extent of the diversion area from the two edge lines.
It should be noted that a diversion line is a prompt line inside the diversion area. The diversion area is located where the two lanes diverge and should not be overly long.
Step 103: perform image rendering within the extent of the diversion area to obtain map data containing diversion lines.
Specifically, with the map rendering method provided in the embodiments of this application, the lane contour lines for constructing a diversion area are obtained from the map data and the extent of the diversion area is determined, thereby identifying the spatial region in which diversion lines need to be rendered; image rendering is then performed within that extent, so that the displayed map data contain diversion lines consistent with the actual road. The realism of the map data and the user experience are thus effectively improved without increasing the data volume or the difficulty of map collection.
In some embodiments, the two edge lines are an original first edge line and an original second edge line. As shown in FIG. 5, step 102 of determining the extent of the diversion area from the two edge lines includes:
Step 1021: acquire the head point and the tail point of each of the original first edge line and the original second edge line.
It should be noted that, once the lane contours have been drawn from the map, the two adjacent edge lines forming the diversion area can be determined from the attributes of each lane line in the map. Each edge line segment has a head point and a tail point, where the head point of the original first edge line and the head point of the original second edge line lie in the same direction, as do the two tail points.
In one or more embodiments, the high-precision map data represent each edge line as a series of geometric points in three-dimensional space, as shown in FIG. 6.
In one or more embodiments, the head points of the original first edge line and the original second edge line may coincide, as shown in FIG. 7.
Step 1022: align the lengths of the original first edge line and the original second edge line according to their head and tail points to obtain the first edge line and the second edge line.
It should be appreciated that aligning the lengths of the original first edge line and the original second edge line makes the extent of the diversion area more regular, i.e., the diversion area is not biased towards either of the two diverging lanes.
In one or more embodiments, aligning the lengths of the original first edge line and the original second edge line according to their head and tail points to obtain the first edge line and the second edge line includes: for each edge line, vertically projecting its head point and tail point onto the other edge line to obtain a first projection point corresponding to the head point and a second projection point corresponding to the tail point, and taking the line segments cut off by the two pairs of first and second projection points as the first edge line and the second edge line, respectively.
Specifically, the first projection point corresponding to the original first edge line is obtained by vertically projecting the head point of the original first edge line onto the original second edge line, and the second projection point corresponding to the original first edge line is obtained by vertically projecting its tail point onto the original second edge line; likewise, the first projection point corresponding to the original second edge line is obtained by vertically projecting the head point of the original second edge line onto the original first edge line, and the second projection point corresponding to the original second edge line is obtained by vertically projecting its tail point onto the original first edge line.
In one or more embodiments, if the foot of the perpendicular falls on the extension of an edge line, the closest point on that edge line is used as the projection point. For example, for the original first and second edge lines shown in FIG. 7, the perpendicular projection of the tail point of the original first edge line falls at point A on the original second edge line, so point A can be used directly as the second projection point corresponding to the first edge line; the perpendicular projection of the tail point of the original second edge line falls at point B on the extension of the original first edge line, so the second projection point corresponding to the original second edge line is the point on the original first edge line closest to point B, namely the tail point of the original first edge line.
Specifically, the segment between the first and second projection points corresponding to the original first edge line (which lie on the original second edge line) is taken as the final second edge line, and the segment between the first and second projection points corresponding to the original second edge line is taken as the final first edge line.
Step 1023: take the area enclosed by the first edge line and the second edge line as the extent of the diversion area.
It should be understood that truncating by perpendicular projection makes the lengths of the resulting first and second edge lines substantially aligned, so that they enclose a more reasonable diversion area.
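As a concrete illustration of this projection-and-truncation step, the following Python sketch treats each edge line as a single straight segment (the patent stores edge lines as polylines of geometric points, FIG. 6) and clamps a projection that falls on an extension to the nearest endpoint. The function names and the NumPy dependency are choices made for this sketch, not part of the patent.

```python
import numpy as np

def project_point_to_segment(p, a, b):
    """Perpendicular projection of p onto segment a-b; if the foot of the
    perpendicular lies outside the segment, clamp to the nearest endpoint
    (the 'closest point' rule described for point B in FIG. 7)."""
    ab = b - a
    t = np.dot(p - a, ab) / np.dot(ab, ab)
    t = np.clip(t, 0.0, 1.0)
    return a + t * ab

def align_edges(edge1, edge2):
    """Length-align two edge lines given as (head, tail) point pairs:
    projections of edge1's endpoints truncate edge2 into the final second
    edge line, and projections of edge2's endpoints truncate edge1 into
    the final first edge line."""
    (h1, t1), (h2, t2) = edge1, edge2
    new_edge2 = (project_point_to_segment(h1, h2, t2),
                 project_point_to_segment(t1, h2, t2))
    new_edge1 = (project_point_to_segment(h2, h1, t1),
                 project_point_to_segment(t2, h1, t1))
    return new_edge1, new_edge2
```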
As a possible embodiment, the extent of the diversion area is bounded by the first edge line and the second edge line. As shown in FIG. 8, step 103 of performing image rendering within the extent of the diversion area to obtain map data containing diversion lines includes:
Step 1031: construct a dividing line for the diversion area, where the dividing line determines the turning positions of the diversion lines.
Since diversion lines are generally V-shaped, to make the rendered diversion lines match reality more closely, a dividing line located between the first edge line and the second edge line needs to be constructed so that the lines connecting the dividing line to the first edge line and to the second edge line form a "V" shape.
Step 1032: determine the anchor points of the multiple diversion lines on the dividing line.
That is, in this application an anchor point that determines the position of a diversion line is selected on the dividing line inside the diversion area; each anchor point corresponds to one diversion line.
Step 1033: construct the diversion lines in the diversion area according to the anchor points.
The plurality of first sampling points can be selected on the dividing line in several ways. For example, the dividing line can be divided equally according to its length, with each division point used as a first sampling point. The division ratio can be a preset proportional relation between the dividing line and the number of first sampling points: for instance, a mapping between the length of the dividing line and the number of first sampling points is preset, the number of sampling points is obtained from the mapping once the length is known, and the dividing line is divided equally by that number. Alternatively, the ratio can be determined from the density of diversion lines collected for the high-precision map: the density of diversion lines in the diversion area is determined from the data collected for drawing the high-precision map, the dividing line is divided equally according to that density, and the division points are used as the first sampling points.
Each first sampling point is an anchor point of a diversion line.
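One possible way to pick equally spaced anchor points on the dividing line (or, later, the second sampling points on the initial center line) is sketched below. Sampling by arc length is an assumption made for illustration, since the patent only requires that the line be divided equally or according to a preset density.

```python
import numpy as np

def equal_arclength_samples(polyline, n):
    """Return n interior points equally spaced by arc length along a
    polyline given as an (m, d) array of vertices."""
    seg = np.diff(polyline, axis=0)
    seg_len = np.linalg.norm(seg, axis=1)
    cum = np.concatenate([[0.0], np.cumsum(seg_len)])
    targets = np.linspace(0.0, cum[-1], n + 2)[1:-1]   # skip the two endpoints
    samples = []
    for s in targets:
        i = min(np.searchsorted(cum, s, side="right") - 1, len(seg_len) - 1)
        t = (s - cum[i]) / seg_len[i]
        samples.append(polyline[i] + t * seg[i])
    return np.array(samples)
```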
Step 1034: perform width rendering on the diversion lines to obtain the rendered map data.
That is, in reality a diversion line is a solid line of a certain width that reminds drivers to travel along the corresponding guided path, so that every vehicle keeps to its own way; it controls, guides, and warns traffic, thereby reducing traffic accidents. Therefore, to make the rendered diversion lines more realistic, they need to be rendered with width, so that the diversion area in the map looks as shown in FIG. 3.
In this way, by dividing the diversion area, determining the anchor points of the diversion lines, drawing the diversion lines based on those anchor points, and rendering their width, the map data containing the rendered diversion area are finally obtained; the rendered diversion lines adapt to the shape of the diversion area, and the realism of the map data is effectively improved.
In one or more embodiments, step 1031 of constructing a dividing line for the diversion area includes: constructing the center line of the included angle between the first edge line and the second edge line; obtaining the line connecting the tail point of the first edge line and the tail point of the second edge line, and taking the included-angle center line cut off by that connecting line as the initial center line; and correcting the initial center line to obtain the dividing line of the diversion area.
In one or more embodiments, the coordinates of the head point and the tail point of each edge line may be used to determine the unit direction vector of that edge line, that is, a vector of length 1 pointing from the head point to the tail point. Specifically, a first unit direction vector of length 1 pointing from the head point of the first edge line to its tail point is determined from the coordinates of those points, and a second unit direction vector of length 1 pointing from the head point of the second edge line to its tail point is determined likewise.
Then the center line of the included angle between the first edge line and the second edge line is constructed from the first and second unit direction vectors. As shown in FIG. 9(a), when the head points of the first and second edge lines coincide, the angular bisector of the first and second unit direction vectors can be used directly as the center line of the included angle; as shown in FIG. 9(b), when the head points do not coincide, reverse extensions of the first and second unit direction vectors can be drawn, and the angular bisector of those reverse extensions is used as the center line of the included angle.
It should be noted that the center line of the included angle should be neither too long nor too short; its length should stay within the diversion area. It can therefore be cut off by the line connecting the tail point of the first edge line with the tail point of the second edge line, and the segment from the angle vertex to the cut-off position is used as the initial center line.
It should be understood that when the head points of the first and second edge lines do not coincide, the point at which the line connecting the head point of the first edge line and the head point of the second edge line cuts the included-angle center line is the starting point of the initial center line.
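A minimal 2D sketch of the initial-center-line construction follows, assuming the simpler case of FIG. 9(a) in which the two head points coincide at the apex. The helper names and the ray/segment intersection routine are illustrative choices, not the patent's implementation.

```python
import numpy as np

def unit(v):
    """Unit vector in the direction of v."""
    return v / np.linalg.norm(v)

def ray_segment_intersection(origin, direction, a, b):
    """Intersection of the 2D ray origin + t*direction (t >= 0) with
    segment a-b; returns None if they do not meet."""
    d, e, diff = direction, b - a, a - origin
    denom = d[0] * e[1] - d[1] * e[0]
    if abs(denom) < 1e-12:
        return None                       # parallel, no single intersection
    t = (diff[0] * e[1] - diff[1] * e[0]) / denom
    s = (diff[0] * d[1] - diff[1] * d[0]) / denom
    return origin + t * d if t >= 0 and 0 <= s <= 1 else None

def initial_center_line(head1, tail1, head2, tail2):
    """Angle-bisector center line of the two edge lines, cut off by the
    line joining the two tail points (FIG. 9); assumes head1 == head2,
    i.e. both edge lines start at the apex of the diversion area."""
    bisector = unit(unit(tail1 - head1) + unit(tail2 - head2))
    end = ray_segment_intersection(head1, bisector, tail1, tail2)
    return head1, end
```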
As shown in FIG. 10, correcting the initial center line to obtain the dividing line of the diversion area includes:
Step 201: select a plurality of second sampling points on the initial center line.
The second sampling points can be selected on the initial center line in several ways. For example, the initial center line can be divided equally according to its length, with each division point used as a second sampling point; the number of divisions can come from a preset mapping between the length of the initial center line and the number of second sampling points, the number being looked up once the length is known. Alternatively, the ratio can be determined from the density of diversion lines collected for the high-precision map: the density of diversion lines in the diversion area is determined from the collected data, the initial center line is divided equally according to that density, and the division points are used as the second sampling points.
Step 202: for each second sampling point, project it perpendicularly onto the first edge line and the second edge line to obtain a third projection point on the first edge line and a fourth projection point on the second edge line.
Step 203: connect the midpoints of the segments joining the third and fourth projection points corresponding to the second sampling points to obtain the dividing line of the diversion area.
Specifically, as shown in FIG. 11, six second sampling points are selected on the initial center line. For each second sampling point, a perpendicular projection onto the first edge line gives the third projection points m1 to m6, and a perpendicular projection onto the second edge line gives the fourth projection points k1 to k6. The third and fourth projection points of each second sampling point are connected: m1 with k1 to obtain L1, m2 with k2 to obtain L2, ..., and m6 with k6 to obtain L6. The midpoints of L1 to L6 are then taken and connected to obtain the target center line.
This ensures that the target center line lies midway between the first edge line and the second edge line and follows their direction, so that when the diversion lines are constructed from the target center line their direction is consistent throughout the diversion area.
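The correction of FIG. 11 can be sketched as follows, reusing project_point_to_segment and equal_arclength_samples from the earlier sketches and again treating each edge line as a single segment; the default sample count of six simply mirrors the figure.

```python
import numpy as np

def dividing_line(initial_line_pts, edge1, edge2, n_samples=6):
    """Correct the initial center line: project each sampled point onto
    both edge lines and connect the midpoints of the projection pairs."""
    samples = equal_arclength_samples(initial_line_pts, n_samples)
    (h1, t1), (h2, t2) = edge1, edge2
    midpoints = []
    for p in samples:
        m = project_point_to_segment(p, h1, t1)   # third projection point
        k = project_point_to_segment(p, h2, t2)   # fourth projection point
        midpoints.append((m + k) / 2.0)
    return np.array(midpoints)                    # polyline of the dividing line
```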
In one or more embodiments, as shown in FIG. 12, each anchor point is a first sampling point on the dividing line, and constructing a diversion line in the diversion area according to the anchor point includes:
Step 301: for each first sampling point, construct a first normal vector and a second normal vector at that point, where the first normal vector lies on the same side of the target center line as the first edge line, and the second normal vector lies on the other side, the same side as the second edge line.
A normal vector here is a unit vector perpendicular to the target center line. Specifically, a normal vector is constructed on each side of the target center line; the one on the same side as the first edge line is taken as the first normal vector, and the one on the same side as the second edge line is taken as the second normal vector.
Step 302: acquire a preset rotation angle.
The rotation angle is the included angle formed by the first edge line and the second edge line.
In one or more embodiments, the included angle formed by the first edge line and the second edge line may be obtained directly from the map data, or it may be determined from the angle between the first unit direction vector of the first edge line and the second unit direction vector of the second edge line.
Step 303: rotate the first normal vector by the rotation angle towards the head point of the first edge line to obtain a first intersection point with the first edge line, and rotate the second normal vector by the rotation angle towards the head point of the second edge line to obtain a second intersection point with the second edge line.
Rotating a normal vector here means rotating it around its intersection point with the target center line.
As shown in FIG. 13, when the included angle between the first edge line and the second edge line is α, the first normal vector is rotated by α towards the head point of the first edge line around its intersection with the target center line, and the second normal vector is rotated by α towards the head point of the second edge line around its intersection with the target center line.
Step 304: connect the first sampling point to the first intersection point and to the second intersection point to obtain a diversion line.
In this way, multiple diversion lines with consistent angles can be drawn in the diversion area.
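A 2D sketch of steps 301 to 304 is given below, reusing unit and ray_segment_intersection from the earlier sketch. Which normal lies on which side of the target center line, and the sign of each rotation "towards the head point", depend on the orientation of the input data, so the signs used here are an assumption for illustration.

```python
import numpy as np

def rotate(v, angle):
    """Rotate a 2D vector counter-clockwise by angle (radians)."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([c * v[0] - s * v[1], s * v[0] + c * v[1]])

def guide_line_at(anchor, tangent, edge1, edge2, alpha):
    """Build one V-shaped diversion line anchored at a first sampling point
    on the dividing line (FIG. 13). tangent points from the apex towards
    the tails; alpha is the included angle of the two edge lines."""
    n1 = rotate(tangent, +np.pi / 2)          # normal on the first-edge side
    n2 = rotate(tangent, -np.pi / 2)          # normal on the second-edge side
    r1 = rotate(n1, +alpha)                   # tilt towards the head point
    r2 = rotate(n2, -alpha)
    p1 = ray_segment_intersection(anchor, r1, *edge1)   # first intersection
    p2 = ray_segment_intersection(anchor, r2, *edge2)   # second intersection
    return p1, anchor, p2                     # the V: p1 -- anchor -- p2
```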
In one or more embodiments, the intersection of a diversion line with the first edge line is the first intersection point and its intersection with the second edge line is the second intersection point. As shown in FIG. 14, performing width rendering on the diversion line to obtain a rendered diversion line includes:
Step 401: obtain the preset width corresponding to the diversion line.
The preset width of the diversion lines may be fixed, i.e., the same for all diversion lines in the map, or it may be determined from the length of the first edge line, the second edge line, or the target center line. For example, a mapping between the length of the target center line and the preset width can be defined in advance; once the length of the target center line is known, the preset width of the diversion lines is obtained from that mapping.
It should be understood that the widths of the diversion lines within the same diversion area are uniform.
Step 402: determine the width region of the diversion line centered on the first sampling point, the first intersection point, and the second intersection point, respectively.
That is, the first sampling point, the first intersection point, and the second intersection point determined above are the center positions of the diversion line.
In one or more embodiments, as shown in FIG. 15, half of the preset width is computed, and for any diversion line each of its defining points is extended by half the preset width in both directions along the straight line on which it lies. Specifically, the first sampling point is extended by half the preset width towards the angle vertex of the first and second edge lines and by half the preset width in the opposite direction; the first intersection point is extended by half the preset width towards the head point and towards the tail point of the first edge line; and the second intersection point is extended by half the preset width towards the head point and towards the tail point of the second edge line. The regions obtained by these extensions form the width region corresponding to the diversion line.
Step 403: divide the width region into sub-regions and fill-render them to obtain the rendered diversion line.
In one or more embodiments, the width region of the diversion line may be rendered stereoscopically with an OpenGL rendering tool; optionally, width rendering uses triangle rendering, in which the width region is divided into at least one group of triangular regions and each triangular region is rendered stereoscopically to obtain the rendered diversion line.
It should be understood that, because the widened diversion lines are rectangular strips, each rectangular region can be divided into a pair of triangular regions sharing one side, and each triangular region is then rendered stereoscopically, which realizes the width rendering of the diversion line. For example, as shown in FIG. 16, the width region to be rendered can be divided into four triangular regions a, b, c, and d, which are then rendered stereoscopically with the OpenGL rendering tool.
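The widening and triangulation of FIG. 15 and FIG. 16 can be sketched as below: each defining point of the V is offset by half the preset width along its own line direction, each arm becomes a quad, and each quad is split into two triangles that can be handed to the renderer (for example as GL_TRIANGLES). The exact offset and winding conventions are assumptions for illustration.

```python
def widen_guide_line(p1, c, p2, edge1_dir, edge2_dir, center_dir, width):
    """Thicken the V-shaped diversion line p1--c--p2 into a filled strip
    and triangulate it; all direction arguments are unit vectors."""
    h = width / 2.0
    c_a, c_b = c - h * center_dir, c + h * center_dir     # along the center line
    p1_a, p1_b = p1 - h * edge1_dir, p1 + h * edge1_dir   # along the first edge
    p2_a, p2_b = p2 - h * edge2_dir, p2 + h * edge2_dir   # along the second edge
    quads = [(p1_a, p1_b, c_b, c_a),                      # arm towards edge 1
             (c_a, c_b, p2_b, p2_a)]                      # arm towards edge 2
    triangles = []
    for q in quads:                                       # split each quad in two
        triangles.append((q[0], q[1], q[2]))
        triangles.append((q[0], q[2], q[3]))
    return triangles
```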
In one or more embodiments, as shown in FIG. 17 and FIG. 18, a map rendering method includes the following steps:
Step 501: obtain the edge line data of the diversion area from the map.
The edge line data comprise an original first edge line and an original second edge line.
Step 502: project the head point and the tail point of the original first edge line perpendicularly onto the original second edge line, project the head point and the tail point of the original second edge line perpendicularly onto the original first edge line, and determine the first edge line and the second edge line from the projection points.
Step 503: construct the first unit direction vector of the first edge line and the second unit direction vector of the second edge line, and construct the initial center line from them.
Step 504: select a plurality of second sampling points on the initial center line.
Step 505: project each second sampling point perpendicularly onto the first edge line and the second edge line, and determine the target center line from the midpoints of the lines connecting the projection points.
Step 506: select a plurality of first sampling points on the target center line.
Step 507: construct a first normal vector and a second normal vector at each first sampling point, rotate the first normal vector by the included angle between the first and second edge lines towards the head point of the first edge line, and rotate the second normal vector by the same angle towards the head point of the second edge line, obtaining a first intersection point with the first edge line and a second intersection point with the second edge line.
Step 508: extend the first sampling point by the preset width along the target center line, the first intersection point by the preset width along the first edge line, and the second intersection point by the preset width along the second edge line, to obtain the width region corresponding to the diversion line.
Step 509: triangulate the width regions and render each triangular region.
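Chaining the helper sketches from the previous sections gives an end-to-end, purely illustrative 2D version of steps 501 to 509; a real implementation would work on 3D polylines and pass the resulting triangles to OpenGL. The default parameter values are arbitrary.

```python
import numpy as np

def render_guide_area(edge1_raw, edge2_raw, n_guide_lines=5, width=0.3):
    """Illustrative pipeline: raw edge lines in, triangle list out."""
    edge1, edge2 = align_edges(edge1_raw, edge2_raw)                 # step 502
    apex, end = initial_center_line(edge1[0], edge1[1],
                                    edge2[0], edge2[1])              # step 503
    divide = dividing_line(np.array([apex, end]), edge1, edge2)      # 504-505
    anchors = equal_arclength_samples(divide, n_guide_lines)         # step 506
    tangent = unit(end - apex)
    d1, d2 = unit(edge1[1] - edge1[0]), unit(edge2[1] - edge2[0])
    alpha = np.arccos(np.clip(np.dot(d1, d2), -1.0, 1.0))            # included angle
    triangles = []
    for c in anchors:                                                # 507-509
        p1, anchor, p2 = guide_line_at(c, tangent, edge1, edge2, alpha)
        if p1 is None or p2 is None:
            continue            # rotated ray misses the truncated edge segment
        triangles += widen_guide_line(p1, anchor, p2, d1, d2, tangent, width)
    return triangles
```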
In summary, the map rendering method provided in the embodiments of this application obtains from the map data the lane contour lines for constructing a diversion area and determines the extent of the diversion area, thereby identifying the spatial region in which diversion lines are to be rendered; it then performs image rendering within that extent, so that the displayed map data contain diversion lines consistent with the actual road. The realism of the map data and the user experience are thus effectively improved without increasing the data volume or the difficulty of map collection.
FIG. 19 is a block diagram illustrating a map rendering apparatus according to still another embodiment of the present application. An embodiment of the present application provides a map rendering apparatus, where the map rendering apparatus 10 includes:
an acquisition module 11, configured to acquire, from map data, lane contour lines for constructing a diversion area, where the lane contour lines comprise the two adjacent edge lines of two adjacent, non-parallel lanes;
a determining module 12, configured to determine the extent of the diversion area from the two edge lines;
and a rendering module 13, configured to perform image rendering within the extent of the diversion area to obtain map data containing diversion lines.
In some embodiments, the two edge lines are an original first edge line and an original second edge line, and the determining module 12 is configured to:
acquire the head point and the tail point of each of the original first edge line and the original second edge line;
align the lengths of the original first edge line and the original second edge line according to their head and tail points to obtain a first edge line and a second edge line;
and take the area enclosed by the first edge line and the second edge line as the extent of the diversion area.
In some embodiments, the determining module 12 is configured to:
for each original edge line, vertically project its head point and tail point onto the other edge line to obtain a first projection point corresponding to the head point and a second projection point corresponding to the tail point;
and take the line segments cut off by the two pairs of first and second projection points as the first edge line and the second edge line, respectively.
In some embodiments, the extent of the diversion area is bounded by the first edge line and the second edge line, and the rendering module 13 is configured to:
construct a dividing line for the diversion area, where the dividing line determines the turning positions of the diversion lines;
determine the anchor points of a plurality of diversion lines on the dividing line;
construct the diversion lines in the diversion area according to the anchor points;
and perform width rendering on the diversion lines to obtain the rendered map data.
In some embodiments, the two edge lines include a first edge line and a second edge line, and the rendering module 13 is configured to:
construct the center line of the included angle between the first edge line and the second edge line;
obtain the line connecting the tail point of the first edge line and the tail point of the second edge line, and take the included-angle center line cut off by that connecting line as an initial center line;
and correct the initial center line to obtain the dividing line of the diversion area.
In some embodiments, the rendering module 13 is configured to:
select a plurality of second sampling points on the initial center line;
for each second sampling point, project it perpendicularly onto the first edge line and the second edge line to obtain a third projection point on the first edge line and a fourth projection point on the second edge line;
and connect the midpoints of the lines joining the third and fourth projection points corresponding to the plurality of second sampling points to obtain the dividing line of the diversion area.
In some embodiments, the rendering module 13 is configured to:
determine a plurality of first sampling points on the dividing line according to a preset rule, and take each first sampling point as an anchor point of a diversion line.
In some embodiments, each anchor point is a first sampling point on the dividing line, and the rendering module 13 is configured to:
for each first sampling point, construct a first normal vector and a second normal vector, where the first normal vector lies on the same side of the target center line as the first edge line, and the second normal vector lies on the other side, the same side as the second edge line;
acquire a preset rotation angle;
rotate the first normal vector by the rotation angle towards the head point of the first edge line to obtain a first intersection point with the first edge line, and rotate the second normal vector by the rotation angle towards the head point of the second edge line to obtain a second intersection point with the second edge line;
and connect the first sampling point to the first intersection point and to the second intersection point to obtain a diversion line.
In some embodiments, the rotation angle is the included angle formed by the first edge line and the second edge line, and the rendering module 13 is configured to:
obtain the included angle formed by the first edge line and the second edge line;
rotate the first normal vector by the included angle towards the head point of the first edge line to obtain a first intersection point with the first edge line, and rotate the second normal vector by the included angle towards the head point of the second edge line to obtain a second intersection point with the second edge line;
and connect the first sampling point to the first intersection point and to the second intersection point to obtain a diversion line.
In some embodiments, the intersections of a diversion line with the two edge lines are a first intersection point and a second intersection point, respectively, and the rendering module 13 is configured to:
acquire the preset width corresponding to the diversion line;
determine the width region of the diversion line centered on the first sampling point, the first intersection point, and the second intersection point, respectively;
and divide the width region into sub-regions and fill-render them to obtain a rendered diversion line.
In some embodiments, width rendering of the diversion line uses triangle rendering, and the rendering module 13 is configured to:
divide the width region into at least one group of triangular regions;
and render each triangular region stereoscopically to obtain a rendered diversion line.
To sum up, the map rendering apparatus provided in the embodiment of the present application determines the area range of the guiding area by obtaining the lane contour line for constructing the guiding area from the map data, so as to determine the space area in which the guiding area line is to be rendered, and then performs image rendering on the area range of the guiding area, so that the finally displayed map data has the guiding area line that conforms to the actual road, thereby effectively improving the authenticity of the map data and improving the user experience without increasing the data volume and difficulty of map acquisition.
It should be understood that the units or modules recited in the map rendering apparatus 10 correspond to the various steps in the method described with reference to fig. 3. Thus, the operations and features described above for the method are equally applicable to the map rendering apparatus 10 and the units included therein, and will not be described again here. The map rendering apparatus 10 may be implemented in a browser or other security applications of the electronic device in advance, or may be loaded into the browser or other security applications of the electronic device by downloading or the like. Corresponding units in the map rendering apparatus 10 may cooperate with units in the electronic device to implement the solution of the embodiment of the present application.
The division into several modules or units mentioned in the above detailed description is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in a single module or unit. Conversely, the features and functions of one module or unit described above may be further divided and embodied by a plurality of modules or units.
It should be noted that, for details not disclosed in the map guide area rendering apparatus of the embodiments of the present application, reference is made to the foregoing embodiments of the present application, and the details are not repeated here.
Referring now to fig. 20, fig. 20 illustrates a schematic diagram of a computer system suitable for use in implementing an electronic device or a server according to embodiments of the present application.
As shown in fig. 20, the computer system 2000 includes a Central Processing Unit (CPU) 2001, which can execute various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 2002 or a program loaded from a storage section 2008 into a Random Access Memory (RAM) 2003. The RAM 2003 also stores various programs and data necessary for the operation of the system. The CPU 2001, the ROM 2002, and the RAM 2003 are connected to one another via a bus 2004. An input/output (I/O) interface 2005 is also connected to the bus 2004.
The following components are connected to the I/O interface 2005: an input portion 2006 including a keyboard, a mouse, and the like; an output portion 2007 including a display device such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD), a speaker, and the like; a storage section 2008 including a hard disk and the like; and a communication section 2009 including a network interface card such as a LAN card or a modem. The communication section 2009 performs communication processing via a network such as the Internet. A drive 2010 is also connected to the I/O interface 2005 as needed. A removable medium 2011, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 2010 as necessary, so that a computer program read out therefrom is installed into the storage section 2008 as necessary.
In particular, according to embodiments of the present application, the process described above with reference to the flowchart of fig. 2 may be implemented as a computer software program. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated by the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 2009, and/or installed from the removable medium 2011. The above-described functions defined in the system of the present application are executed when the computer program is executed by the Central Processing Unit (CPU) 2001.
It should be noted that the computer readable medium shown in the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, an electromagnetic signal, an optical signal, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium other than a computer readable storage medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, and the like, or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or by combinations of special purpose hardware and computer instructions.
The units or modules described in the embodiments of the present application may be implemented by software or hardware. The described units or modules may also be provided in a processor, which may be described as: a processor includes an acquisition module, a determination module, a construction module, and a rendering module. The names of these units or modules do not, in some cases, constitute a limitation on the units or modules themselves; for example, the acquisition module may also be described as "a module for acquiring the edge lines of a guide area from a map, the edge lines including a first edge line and a second edge line".
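A minimal sketch of how the four modules named above might be composed in code is given below; all class and method names are hypothetical and are not taken from the patent.

```python
class MapRenderer:
    """Illustrative composition of the acquisition/determination/construction/rendering modules."""

    def __init__(self, acquisition, determination, construction, rendering):
        self.acquisition = acquisition        # fetches guide-area edge lines from map data
        self.determination = determination    # derives the area range of the guide area
        self.construction = construction      # builds the guide area lines inside that range
        self.rendering = rendering            # width-renders and fills the lines

    def render(self, map_data):
        edge_lines = self.acquisition.get_edge_lines(map_data)
        area_range = self.determination.area_range(edge_lines)
        guide_lines = self.construction.build_guide_lines(area_range)
        return self.rendering.draw(map_data, guide_lines)
```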
As another aspect, the present application also provides a computer-readable storage medium, which may be included in the electronic device described in the above embodiments, or may exist separately without being assembled into the electronic device. The computer-readable storage medium stores one or more programs that when executed by one or more processors perform the map rendering methods described herein.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure herein is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (14)

1. A map rendering method, comprising:
acquiring a lane contour line for constructing a guide area from map data, wherein the lane contour line comprises two adjacent edge lines on two adjacent non-parallel lanes;
determining an area range of the guide area according to the two edge lines;
and performing image rendering based on the area range of the guide area to obtain map data containing a guide area line.
2. The method of claim 1, wherein the two edge lines comprise an original first edge line and an original second edge line, and the determining an area range of the guide area according to the two edge lines comprises:
acquiring a head point and a tail point of each of the original first edge line and the original second edge line;
performing length alignment on the original first edge line and the original second edge line according to the head points and the tail points of the original first edge line and the original second edge line, to obtain a first edge line and a second edge line;
and taking an area enclosed by the first edge line and the second edge line as the area range of the guide area.
3. The method of claim 2, wherein the performing length alignment on the original first edge line and the original second edge line according to the head points and the tail points of the original first edge line and the original second edge line to obtain a first edge line and a second edge line comprises:
for each original edge line, perpendicularly projecting its head point and tail point onto the other original edge line to obtain a first projection point corresponding to the head point and a second projection point corresponding to the tail point;
and taking the line segments intercepted between the two groups of first projection points and second projection points as the first edge line and the second edge line, respectively.
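For readers who want a concrete picture of the projection-based length alignment in claims 2-3, here is a simplified Python sketch that treats each original edge line as a single straight segment; the helper names and the exact trimming rule are assumptions, and real edge lines would typically be polylines.

```python
def project_point(p, a, b):
    """Perpendicular projection of point p onto segment ab, clamped to the segment."""
    ax, ay = a
    vx, vy = b[0] - ax, b[1] - ay
    t = ((p[0] - ax) * vx + (p[1] - ay) * vy) / ((vx * vx + vy * vy) or 1.0)
    t = max(0.0, min(1.0, t))
    return (ax + t * vx, ay + t * vy)

def align_edge_lines(edge1, edge2):
    """Trim each edge line to the span delimited by the projections of the other one's endpoints."""
    (h1, t1), (h2, t2) = edge1, edge2           # (head point, tail point) of each edge line
    aligned1 = (project_point(h2, h1, t1), project_point(t2, h1, t1))   # first edge line
    aligned2 = (project_point(h1, h2, t2), project_point(t1, h2, t2))   # second edge line
    return aligned1, aligned2
```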
4. The method of claim 1, wherein the area range of the guide area is composed of a first edge line and a second edge line, and the performing image rendering based on the area range of the guide area to obtain the map data containing the guide area line comprises:
constructing a dividing line for the guide area, wherein the dividing line is used for determining a turning position of the guide area line;
determining positioning points of a plurality of guide area lines on the dividing line;
constructing the guide area line in the guide area according to the positioning points;
and performing width rendering on the guide area line to obtain the rendered map data.
5. The method of claim 4, wherein the constructing a dividing line for the guide area comprises:
constructing an included-angle center line of the first edge line and the second edge line;
acquiring a connecting line between a tail point of the first edge line and a tail point of the second edge line, and taking the part of the included-angle center line cut off by the connecting line as an initial center line;
and correcting the initial center line to obtain the dividing line of the guide area.
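The following sketch illustrates one way of building the initial center line of claim 5: intersect the two (non-parallel) edge lines to find the apex of their included angle, take the angle bisector from that apex, and cut it with the line joining the two tail points. The function names and the straight-segment simplification are assumptions of this sketch.

```python
import math

def unit(v):
    n = math.hypot(v[0], v[1]) or 1.0
    return (v[0] / n, v[1] / n)

def line_intersection(p1, d1, p2, d2):
    """Intersection of two infinite lines given as point + direction, or None if parallel."""
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        return None
    w = (p2[0] - p1[0], p2[1] - p1[1])
    t = (w[0] * d2[1] - w[1] * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

def initial_center_line(edge1, edge2):
    """Included-angle bisector of the two edge lines, cut by the tail-point connecting line."""
    (h1, t1), (h2, t2) = edge1, edge2
    d1 = unit((t1[0] - h1[0], t1[1] - h1[1]))
    d2 = unit((t2[0] - h2[0], t2[1] - h2[1]))
    apex = line_intersection(h1, d1, h2, d2)        # where the extended edge lines meet
    bisector = unit((d1[0] + d2[0], d1[1] + d2[1])) # direction of the included-angle center line
    tail_dir = (t2[0] - t1[0], t2[1] - t1[1])
    end = line_intersection(apex, bisector, t1, tail_dir)
    return apex, end                                # the initial center line as a segment
```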
6. The method of claim 5, wherein the correcting the initial center line to obtain the dividing line of the guide area comprises:
selecting a plurality of second sampling points on the initial center line;
perpendicularly projecting each second sampling point onto the first edge line and the second edge line respectively, to obtain a third projection point on the first edge line and a fourth projection point on the second edge line;
and connecting the midpoints of the connecting lines between the third projection points and the fourth projection points corresponding to the plurality of second sampling points, to obtain the dividing line of the guide area.
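Sticking with the same simplified straight-segment setup, and reusing project_point from the sketch under claim 3, the correction of claim 6 could look like the sketch below; the sample count and helper names are assumptions.

```python
def corrected_dividing_line(initial_center, edge1, edge2, samples=16):
    """Replace each second sampling point by the midpoint of its two projections."""
    a, b = initial_center
    dividing = []
    for i in range(samples + 1):
        t = i / samples
        s = (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))   # second sampling point
        p3 = project_point(s, *edge1)    # third projection point, on the first edge line
        p4 = project_point(s, *edge2)    # fourth projection point, on the second edge line
        dividing.append(((p3[0] + p4[0]) / 2.0, (p3[1] + p4[1]) / 2.0))
    return dividing                       # polyline used as the dividing line
```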
7. The method of claim 4, wherein the determining positioning points of a plurality of guide area lines on the dividing line comprises:
determining a plurality of first sampling points on the dividing line according to a preset rule, and taking each first sampling point as a positioning point of a guide area line.
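One commonly used "preset rule" is a fixed spacing along the dividing line; the sketch below walks the dividing-line polyline and emits a first sampling point every `spacing` units of arc length. The fixed-spacing rule and the function name are assumptions of this sketch, not a statement of what the patent requires.

```python
import math

def sample_positioning_points(dividing_line, spacing):
    """Emit a positioning point every `spacing` units of arc length along the polyline."""
    points, walked = [], 0.0            # `walked`: distance since the last emitted point
    for (x0, y0), (x1, y1) in zip(dividing_line, dividing_line[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        d = spacing - walked            # distance into this segment of the next point
        while d <= seg and seg > 0.0:
            t = d / seg
            points.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
            d += spacing
        walked = (walked + seg) % spacing
    return points

# Example: positioning points roughly every 2 units along an L-shaped dividing line.
print(sample_positioning_points([(0, 0), (4, 0), (4, 3)], spacing=2.0))
# -> approximately [(2.0, 0.0), (4.0, 0.0), (4.0, 2.0)]
```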
8. The method of claim 4, wherein the positioning point is a first sampling point on the dividing line, and the constructing the guide area line in the guide area according to the positioning points comprises:
for each first sampling point, constructing a first normal vector and a second normal vector of the first sampling point, wherein the first normal vector and the first edge line are located on one side of the target center line, and the second normal vector and the second edge line are located on the other side of the target center line;
acquiring a preset rotation angle;
controlling the first normal vector to rotate by the rotation angle towards the head point of the first edge line to obtain a first intersection point of the first normal vector and the first edge line, and controlling the second normal vector to rotate by the rotation angle towards the head point of the second edge line to obtain a second intersection point of the second normal vector and the second edge line;
and connecting the first sampling point to the first intersection point and to the second intersection point respectively, to obtain the guide area line.
9. The method of claim 8, wherein the rotation angle is the included angle formed by the first edge line and the second edge line, and the method further comprises:
acquiring the included angle formed by the first edge line and the second edge line;
controlling the first normal vector to rotate by the included angle towards the head point of the first edge line to obtain the first intersection point of the first normal vector and the first edge line, and controlling the second normal vector to rotate by the included angle towards the head point of the second edge line to obtain the second intersection point of the second normal vector and the second edge line;
and connecting the first sampling point to the first intersection point and to the second intersection point respectively, to obtain the guide area line.
10. The method of claim 4, wherein an intersection point of the guide area line and the first edge line is a first intersection point, an intersection point of the guide area line and the second edge line is a second intersection point, and the performing width rendering on the guide area line to obtain a rendered guide area line comprises:
acquiring a preset width corresponding to the guide area line;
determining a width area of the guide area line by taking the first sampling point, the first intersection point and the second intersection point as centers respectively;
and performing area division on the width area, and performing filling rendering on the divided areas to obtain the rendered guide area line.
11. The method of claim 10, wherein width rendering is performed on the guide area line by a triangle rendering method, and the performing area division on the width area and performing filling rendering on the divided areas to obtain the rendered guide area line comprises:
dividing the width area into at least one group of triangular areas;
and performing three-dimensional rendering on each triangular area to obtain the rendered guide area line.
12. A map rendering apparatus, characterized in that the apparatus comprises:
the acquisition module is used for acquiring a lane contour line for constructing a guide area from map data, wherein the lane contour line comprises two adjacent edge lines on two adjacent non-parallel lanes;
the determining module is used for determining the area range of the guide area according to the two edge lines;
and the rendering module is used for performing image rendering based on the area range of the guide area to obtain the map data containing the guide area line.
13. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements a map rendering method as claimed in any one of claims 1-11.
14. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out a map rendering method according to any one of claims 1 to 11.
CN202111104795.7A 2021-09-18 2021-09-18 Map rendering method, device, equipment and medium Pending CN113763522A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111104795.7A CN113763522A (en) 2021-09-18 2021-09-18 Map rendering method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111104795.7A CN113763522A (en) 2021-09-18 2021-09-18 Map rendering method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN113763522A true CN113763522A (en) 2021-12-07

Family

ID=78796586

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111104795.7A Pending CN113763522A (en) 2021-09-18 2021-09-18 Map rendering method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN113763522A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114427858A (en) * 2022-01-29 2022-05-03 智道网联科技(北京)有限公司 Map diversion strip generation method, device and equipment
CN114427858B (en) * 2022-01-29 2024-04-16 智道网联科技(北京)有限公司 Map guide belt generation method, device and equipment

Similar Documents

Publication Publication Date Title
US20210001877A1 (en) Determination of lane connectivity at traffic intersections for high definition maps
EP3505869B1 (en) Method, apparatus, and computer readable storage medium for updating electronic map
CN111462275B (en) Map production method and device based on laser point cloud
CN108961990B (en) Method and apparatus for processing high-precision map
EP3951321A1 (en) Method and system for rapid generation of reference driving route, terminal and storage medium
CN111142525A (en) High-precision map lane topology construction method and system, server and medium
CN111238504B (en) Road segment modeling data generation method and device of road map and related system
CN111238502B (en) Road map generation method, device and related system
CN112033420A (en) Lane map construction method and device
CN111238498B (en) Road map generation method, device and related system for lane-level display
CN116740667B (en) Intersection surface data generation method and device, electronic equipment and storage medium
CN111238503B (en) Map generation method, device and related system for road segments
CN113763522A (en) Map rendering method, device, equipment and medium
CN110782774A (en) Crowdsourcing road data distributed processing method and device
CN112017262A (en) Pavement marker generation method and device, storage medium and electronic equipment
CN116295336A (en) Construction method, device, equipment and storage medium of map hierarchical structure
CN114705180B (en) Data correction method, device and equipment for high-precision map and storage medium
CN111238500A (en) Map generation method, device and system for road segments of road map area
CN116091716A (en) High-precision map automatic manufacturing system and method based on deep learning
CN115631476A (en) Lane data processing method, system, electronic device and storage medium
CN114136327A (en) Automatic inspection method and system for recall ratio of dotted line segment
CN111238499B (en) Road map generation method and device and related system
CN109520513B (en) Three-dimensional map drawing method and device
CN115683143A (en) High-precision navigation method and device, electronic equipment and storage medium
CN111637898B (en) Processing method and device for high-precision navigation electronic map

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination