CN117387653A - Display method, device and equipment of navigation guide arrow in navigation map - Google Patents


Info

Publication number
CN117387653A
Authority
CN
China
Prior art keywords: arrow, flexible, navigation, determining, map
Legal status
Pending
Application number
CN202311319875.3A
Other languages
Chinese (zh)
Inventor
孙旭强
李雨石
邢生健
马巍
Current Assignee
Hangzhou Langge Technology Co ltd
Zhejiang Geely Holding Group Co Ltd
Original Assignee
Hangzhou Langge Technology Co ltd
Zhejiang Geely Holding Group Co Ltd
Application filed by Hangzhou Langge Technology Co ltd and Zhejiang Geely Holding Group Co Ltd
Priority to CN202311319875.3A
Publication of CN117387653A
Legal status: Pending


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00, specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3632 Guidance using simplified or iconic instructions, e.g. using arrows


Abstract

The application provides a method, an apparatus, and a device for displaying navigation guidance arrows in a navigation map, and relates to the technical field of intelligent navigation. The method includes: displaying, based on a preset navigation strategy, a flexible navigation guidance arrow on the three-dimensional map road network shape of a target intersection in the navigation map. After the basic shape frame of the flexible navigation guidance arrow is determined based on the target intersection, the arrow is flipped toward the virtual camera by a corresponding flip angle and then drawn and rendered; the flip angle is determined based on camera parameter information corresponding to the target intersection. The method provides a more comfortable visual navigation effect for the user and improves navigation guidance accuracy.

Description

Display method, device and equipment of navigation guide arrow in navigation map
Technical Field
The present disclosure relates to the field of intelligent navigation technologies, and in particular, to a method, an apparatus, and a device for displaying a navigation guidance arrow in a navigation map.
Background
The navigation guidance arrow is a visual direction indicator that a navigation product displays, along the planned route from start to destination, at each position where the user needs to make a steering action; it indicates both the position and the direction of the required turn.
In the prior art, the elevation angle of the virtual camera is usually set to 90 degrees, so the user observes the navigation map plane from a top-down view, and the navigation guidance arrow is presented on the map plane as a two-dimensional plane graphic. With the development of navigation technology, when the navigation map is presented in three-dimensional form, the camera elevation angle is no longer limited to 90 degrees. If the navigation guidance arrow is still presented on the map surface as a two-dimensional plane graphic, the perspective principle makes the arrow end appear smaller, so the user may not see the full view of the arrow; the action guidance is then unclear and the user experience is poor.
Therefore, a new method for displaying navigation guidance arrows in a navigation map is needed, one that displays the arrow completely on the map surface and thereby provides accurate and clear navigation guidance for users.
Disclosure of Invention
The application provides a method, an apparatus, and a device for displaying navigation guidance arrows in a navigation map, which address the problem of providing accurate and clear navigation guidance for users.
In a first aspect, the present application provides a method for displaying a navigation guidance arrow in a navigation map, where the method includes:
displaying, based on a preset navigation strategy, a flexible navigation guidance arrow on the three-dimensional map road network shape of a target intersection in a navigation map;
where, after the basic shape frame of the flexible navigation guidance arrow is determined based on the target intersection, the arrow is flipped toward the virtual camera by a corresponding flip angle and then drawn and rendered, and the flip angle is determined based on camera parameter information corresponding to the target intersection.
Optionally, before displaying the flexible navigation guidance arrow, the method further includes:
acquiring road data information of the target intersection, and drawing the basic shape frame of the flexible navigation guidance arrow based on the road data information; the road data information includes the map road network shapes of the entry road and the exit road of the target intersection; the basic shape frame contains an arrow centerline consistent with the road shape, the centerline carries at least 3 flexible nodes, and each flexible node corresponds to 2 frame shape points;
acquiring camera parameter information corresponding to the target intersection, determining, based on the camera parameter information, the flip angle of the flexible node at the arrow end position on the arrow centerline, and determining the flip angles of the other flexible nodes; the camera parameter information characterizes the relevant parameters of the virtual camera when navigating to the target intersection;
determining, according to the flip angle of each flexible node, the three-dimensional coordinates of each frame shape point on the flipped basic shape frame;
and performing drawing and rendering based on the three-dimensional coordinates to obtain the flexible navigation guidance arrow.
Optionally, drawing the basic shape frame of the flexible navigation guidance arrow based on the road data information of the target intersection includes:
drawing the arrow centerline based on the map road network shapes of the entry road and the exit road of the target intersection;
expanding a preset distance to both sides of the arrow centerline, with the centerline as the center, and determining the original coordinates of the frame shape points corresponding to each flexible node on the centerline;
and drawing the basic shape frame of the flexible navigation guidance arrow based on the original coordinates of the frame shape points.
Optionally, the camera parameter information includes one or more of: a camera height, a first distance, a second distance, the elevation angle, and a preset observation angle;
where the camera height represents the height of the virtual camera above the map surface; the first distance represents the distance from the foot point of the virtual camera on the map surface to the own-vehicle icon; the second distance represents the distance from the own-vehicle icon to the flexible node at the arrow end position on the arrow centerline; and the preset observation angle represents the angle between the virtual camera's line of sight and the cross section at the arrow end position on the flipped basic shape frame.
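The camera parameter items listed above can be collected in a small structure. A minimal sketch; all field names are illustrative and do not appear in the patent, angles are in degrees, and distances are in the map's length unit:

```python
from dataclasses import dataclass

@dataclass
class CameraParams:
    """Camera parameter information for one target intersection.

    Field names are assumptions for illustration only.
    """
    camera_height: float        # height of the virtual camera above the map surface
    first_distance: float       # camera foot point on the map -> own-vehicle icon
    second_distance: float      # own-vehicle icon -> flexible node at the arrow end
    elevation_angle: float      # elevation angle of the virtual camera, degrees
    observation_angle: float = 90.0  # preset observation angle at the arrow end, degrees
```

A preset navigation strategy could, for instance, map each intersection type to one such parameter set.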
Optionally, if the preset observation angle is 90 degrees, determining the flip angle of the flexible node at the arrow end position on the arrow centerline based on the camera parameter information includes:
determining the sum of the first distance and the second distance, determining the ratio of that sum to the camera height, and determining the flip angle of the flexible node at the arrow end position based on the ratio and a preset functional relation.
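The computation above can be sketched as follows. The patent does not specify the "preset functional relation"; the arctangent used here is an assumption, chosen because it is the relation under which the flipped arrow-end cross section faces the camera directly when the observation angle is 90 degrees:

```python
import math

def end_node_flip_angle(camera_height: float,
                        first_distance: float,
                        second_distance: float) -> float:
    """Flip angle (radians) of the flexible node at the arrow end.

    Ratio of (first_distance + second_distance) to camera_height, fed
    through an assumed arctangent as the "preset functional relation".
    """
    distance_sum = first_distance + second_distance
    return math.atan(distance_sum / camera_height)
```

For example, with the camera 100 units above the map and the arrow end 100 units from the camera foot point, the flip angle comes out to 45 degrees.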
Optionally, determining the flip angles of the flexible nodes other than the flexible node at the arrow end position includes:
determining a flip angle range based on the flip angle of the flexible node at the arrow end position on the arrow centerline; within this range, the flip angle of the node at the arrow end position is the maximum value, and the flip angle of the node at the arrow start position on the centerline is the minimum value;
and determining the flip angles of the remaining flexible nodes based on the flip angle range and a preset angle determination strategy, where the strategy is an arithmetic (evenly spaced) distribution or a nonlinear variation.
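One of the two strategies named above, the arithmetic distribution, can be sketched as follows (a nonlinear schedule such as an ease-in curve could be substituted; the zero start angle is an assumption):

```python
def node_flip_angles(num_nodes: int,
                     end_angle: float,
                     start_angle: float = 0.0) -> list[float]:
    """Evenly spaced flip angles over the flexible nodes.

    Nodes are ordered from the arrow start (minimum angle, assumed 0
    here) to the arrow end (maximum angle).
    """
    if num_nodes < 2:
        return [end_angle] * num_nodes
    step = (end_angle - start_angle) / (num_nodes - 1)
    return [start_angle + i * step for i in range(num_nodes)]
```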
Optionally, determining the three-dimensional coordinates of each frame shape point on the flipped basic shape frame according to the flip angle of each flexible node includes:
determining the height of each frame shape point on the flipped basic shape frame relative to the map surface, based on the flip angle of each flexible node and the preset distance;
and determining the three-dimensional coordinates of each frame shape point based on the original coordinates of that frame shape point and the height.
Optionally, after determining the height of each frame shape point on the flipped basic shape frame relative to the map surface, the method further includes:
checking the value of each height, and if any height value is smaller than 0, uniformly adjusting the heights of all frame shape points on the flipped basic shape frame relative to the map surface.
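The two steps above, heights from flip angles followed by a uniform lift when a height falls below 0, can be sketched as follows. The height formula d * sin(theta) is one assumed reading of "based on the flip angle and the preset distance", not the patent's stated formula:

```python
import math

def frame_point_coords(orig_xy_pairs, flip_angles, preset_distance):
    """3-D coordinates of the frame shape points after the flip.

    orig_xy_pairs: per flexible node, a pair ((x1, y1), (x2, y2)) of
    original frame-shape-point coordinates on the map surface.
    flip_angles: one flip angle (radians) per flexible node.
    """
    coords = []
    for (p1, p2), theta in zip(orig_xy_pairs, flip_angles):
        # Assumed height reading: half-width d times sin of the flip angle.
        z = preset_distance * math.sin(theta)
        coords.append(((p1[0], p1[1], z), (p2[0], p2[1], z)))
    # Uniform adjustment from the text: if any point sank below the map
    # surface, lift every point by the same amount.
    min_z = min(z for pair in coords for (_, _, z) in pair)
    if min_z < 0:
        coords = [tuple((x, y, z - min_z) for (x, y, z) in pair)
                  for pair in coords]
    return coords
```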
In a second aspect, the present application provides a display device for a navigation guidance arrow in a navigation map, the device comprising:
a display unit, configured to display a flexible navigation guidance arrow on the three-dimensional map road network shape of a target intersection in a navigation map based on a preset navigation strategy;
where, after the basic shape frame of the flexible navigation guidance arrow is determined based on the target intersection, the arrow is flipped toward the virtual camera by a corresponding flip angle and then drawn and rendered, and the flip angle is determined based on camera parameter information corresponding to the target intersection.
In a third aspect, the present application provides an electronic device, including: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored in the memory to implement the method as described above.
In a fourth aspect, the present application provides a computer-readable storage medium having stored therein computer-executable instructions which, when executed by a processor, implement the method described above.
In a fifth aspect, the present application provides a computer program product comprising a computer program which, when executed by a processor, implements the method described above.
The method, apparatus, and device for displaying navigation guidance arrows in a navigation map provided by the application include: displaying, based on a preset navigation strategy, a flexible navigation guidance arrow on the three-dimensional map road network shape of a target intersection in a navigation map; after the basic shape frame of the flexible navigation guidance arrow is determined based on the target intersection, the arrow is flipped toward the virtual camera by a corresponding flip angle and then drawn and rendered, and the flip angle is determined based on camera parameter information corresponding to the target intersection. On the one hand, the flexible navigation guidance arrow is smoother than a rigid arrow, so it provides a more comfortable visual navigation effect and improves the user experience. On the other hand, because the flip angle is determined from camera parameter information that differs from intersection to intersection, the arrow's shape is adjusted differently under different camera parameters; the flexible navigation guidance arrow therefore fits each intersection more closely, suits the actual navigation scene better, and yields more accurate navigation guidance and a better user experience.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic position diagram of a virtual camera provided in the present application;
FIG. 2 is an exemplary diagram of a navigational guidance arrow provided herein;
fig. 3 is a flow chart of a method for displaying navigation guidance arrows in a navigation map according to an embodiment of the present application;
FIG. 4 is an exemplary diagram of yet another navigational guidance arrow provided herein;
FIG. 5 is a flow chart of a method for generating flexible navigational guidance arrows according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a basic shape frame for generating flexible navigation directions arrows according to an embodiment of the present application;
fig. 7 is a schematic diagram of information of each parameter when determining a flip angle of a flexible node at an end position of an arrow according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a basic shape frame according to an embodiment of the present disclosure;
FIG. 9 is an exemplary diagram of determining the height of a frame shape point relative to the map surface according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a display device of a navigation guidance arrow in a navigation map according to an embodiment of the present application;
Fig. 11 is a schematic structural diagram of a display device of a navigation guidance arrow in a navigation map according to another embodiment of the present application;
fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Specific embodiments thereof have been shown by way of example in the drawings and will herein be described in more detail. These drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but to illustrate the concepts of the present application to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the accompanying claims.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims of this application and in the above-described figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented, for example, in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The main picture of a navigation product during navigation generally includes a navigation map, a planned route, an own-vehicle icon, an action guidance arrow, and other information. The action guidance arrow, also called a direction guidance arrow or navigation guidance arrow, indicates the position and direction of the steering action the user needs to make. The frame size, map range, and viewing angle of each frame of the picture during navigation are determined by a virtual camera set by the program: the FOV (field of view) angle of the virtual camera determines the frame size, the camera height determines the map range visible to the user, and the camera attitude (pitch angle, roll angle, and tilt angle) determines the user's viewing angle. Therefore, to give the user clear guidance during navigation, the camera height often has to be adjusted frequently within one navigation section, the map scale zoomed in and out, and the user's viewpoint center focused on the action guidance arrow at the intersection.
Illustratively, fig. 1 is a schematic position diagram of the virtual camera provided in the present application. As shown in fig. 1, the elevation angle of the virtual camera relative to the vehicle icon may be elevation angle 1, whose value is 90 degrees, or elevation angle 2, which is less than 90 degrees. When the elevation angle differs, the user's viewing angle differs, and so does the navigation picture the user sees.
Current navigation products typically set the elevation angle of the virtual camera to 90 degrees so that the user views the navigation map in a top view, with the navigation directions arrows presented on the map in a two-dimensional planar graphic. When the user gets away from the steering action intersection, the camera height is increased, the map scale is reduced, and when the user gets close to the steering action intersection, the camera height is reduced, and the map scale is enlarged. In the continuous adjustment process of the camera, the length and the width of the navigation guidance arrow are adjusted simultaneously, so that the size of the navigation guidance arrow is matched with the shape of the map road network, and therefore, relatively clear, reasonable and smooth navigation guidance is provided for a user.
However, with the development of navigation technology, buildings in the navigation map are no longer limited to two-dimensional display but are increasingly presented in three-dimensional form, and the navigation guidance arrow is no longer limited to a common rigid arrow but may be a flexible arrow. When the navigation map is presented in three-dimensional form, the elevation angle of the virtual camera is not limited to 90 degrees. If the navigation guidance arrow is still presented on the map surface as a two-dimensional plane graphic, the perspective principle makes the arrow end appear smaller, so the user may not see the full view of the arrow; the action guidance is then unclear and the user experience is poor.
Illustratively, fig. 2 is an exemplary diagram of a navigation guidance arrow provided herein. In the navigation map shown in fig. 2, the elevation angle of the virtual camera is set to less than 90 degrees. In the visual presentation, the radial plane of the framed navigation guidance arrow 20 is parallel to the road surface, and the arrow bends within that plane at the position where steering is required. Because of the perspective principle, the arrow end is rendered smaller, the full view of the arrow is not shown, and the user can hardly see the arrow end. The guidance is therefore unclear and the user experience is poor.
To solve the above problems, the application provides a method for displaying a navigation guidance arrow in a navigation map. The method converts the navigation guidance arrow from a traditional rigid arrow into a flexible arrow and rotates the sampling points on the arrow according to the actual elevation angle of the virtual camera, so that the flexible navigation guidance arrow has a certain flip angle. When the elevation angle of the virtual camera relative to the own-vehicle icon is less than 90 degrees, the flexible navigation guidance arrow is still displayed completely on the map surface, so that the user can clearly observe and identify it, which provides more accurate navigation guidance and improves the user experience.
The technical solutions of the present application, and how they solve the above technical problems, are described in detail below with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present application are described below with reference to the accompanying drawings.
Fig. 3 is a flowchart of a method for displaying a navigation guidance arrow in a navigation map according to an embodiment of the present application. The execution subject of this embodiment may be a display device for a navigation guidance arrow in a navigation map. The display device may be located on an electronic device such as a mobile phone, a tablet computer, or a computer, and the electronic device may in turn be located on a vehicle such as an automobile.
The embodiment of the present application is described in detail below, taking as an example an execution subject that is a display device for a navigation guidance arrow in a navigation map (hereinafter simply referred to as the display device).
As shown in fig. 3, the method for displaying navigation guidance arrows in the navigation map provided in this embodiment includes:
s301, displaying a flexible navigation guiding arrow on the three-dimensional map road network shape of a target intersection in a navigation map based on a preset navigation strategy; the flexible navigation guiding arrow is drawn and rendered after being turned over in the direction of the virtual camera by a corresponding turning angle after the basic shape frame of the flexible navigation guiding arrow is determined based on the target intersection, and the turning angle is determined based on camera parameter information corresponding to the target intersection.
Illustratively, the application aims to show the user a clear and complete flexible navigation guidance arrow during navigation, so that the user can quickly observe and recognize the direction the arrow points to and make a more accurate steering action. The display device therefore displays the flexible navigation guidance arrow on the three-dimensional map road network shape of the target intersection based on the preset navigation strategy. A flexible arrow is smoother than a rigid arrow and provides a more comfortable visual effect during navigation guidance, improving the user experience.
Illustratively, fig. 4 is an exemplary diagram of yet another navigation guidance arrow provided herein. As shown in fig. 4, with the solution of the present application, the flexible navigation guidance arrow 40 is still displayed completely on the map surface when the elevation angle of the virtual camera relative to the own-vehicle icon is less than 90 degrees. The user can then see the direction the arrow points to clearly and definitely, making it easy to perform an accurate steering action.
In addition, after the basic shape frame of the flexible navigation guidance arrow is determined based on the target intersection, the arrow is flipped toward the virtual camera by a corresponding flip angle and then drawn and rendered, the flip angle being determined based on camera parameter information corresponding to the target intersection. Different intersections correspond to different camera parameter information, and different shape adjustments are made under different camera parameters, so the flexible navigation guidance arrow fits each intersection more closely, suits the actual navigation scene better, and yields more accurate navigation guidance and a better user experience.
The flexible navigation guidance arrow may be pre-generated, stored in a database, and retrieved for display as needed during actual navigation; or it may be generated and displayed in real time according to the actual road conditions. This is not limited here and may be configured in the preset navigation strategy as required.
Thus, in one possible example, the method of the present application may further include a process of generating the flexible navigation guidance arrow before displaying it. Illustratively, fig. 5 is a flowchart of a method for generating a flexible navigation guidance arrow according to an embodiment of the present application. As shown in fig. 5, the process of generating the flexible navigation guidance arrow may include:
s501, acquiring road data information of a target intersection, and generating a basic shape frame of a flexible navigation guiding arrow based on the road data information of the target intersection; the road data information comprises map road network shapes of an entering road and an exiting road of a target intersection, an arrow center line is included in a basic shape frame, the arrow center line is consistent with the road shape, at least 3 flexible nodes are included on the arrow center line, and each flexible node corresponds to 2 frame shape points.
Illustratively, the target intersection is an intersection at which the action guidance arrow generated by the arrow generation method of the present application needs to be displayed. The target intersection may be determined in the conventional way: for any pair of entry and exit roads in the navigation map, the bending angle of the road, i.e., the steering angle through which the vehicle must turn, is calculated, and when this steering angle satisfies certain strategy conditions, the intersection is determined to be a target intersection for which a flexible navigation guidance arrow needs to be generated.
After the target intersection is determined, its road data information is acquired. The road data information mainly includes the map road network shapes of the entry road and the exit road of the target intersection, and the basic shape frame of the flexible navigation guidance arrow is then drawn according to those shapes.
In one example, drawing the basic shape frame of the flexible navigation guidance arrow based on the road data information of the target intersection may include:
S1, drawing the arrow centerline based on the map road network shapes of the entry road and the exit road of the target intersection.
S2, expanding a preset distance to both sides of the arrow centerline, with the centerline as the center, and determining the original coordinates of the frame shape points corresponding to each flexible node on the centerline.
S3, drawing the basic shape frame of the flexible navigation guidance arrow based on the original coordinates of the frame shape points.
For example, an initial line, i.e., the arrow centerline, may be drawn based on the shapes of the entry road and the exit road. The line is directed: its start point lies nearer the vehicle position, and its end point lies farther from the vehicle and nearer the target position. In practice, the arrow centerline can be extended toward the start and end points, keeping its shape consistent with the road shape in the extension direction, to form an arrow whose pointing is more intuitive. The arrow surface patch is then generated by expanding a preset distance to both sides of the centerline, with the centerline as the center; this patch is the outline of the basic shape frame. The preset distance is the distance from a flexible node on the centerline to its frame shape point, so the width of the basic shape frame is twice the preset distance.
In a specific implementation, at least 3 flexible nodes (such as the start point, middle point, and end point of the arrow) can be determined on the drawn centerline by interpolation sampling; the spacing between flexible nodes can also be adjusted as needed to determine more nodes. With more flexible nodes, the drawn basic shape frame is smoother and the resulting flexible navigation guidance arrow is more visually pleasing, but the data processing load is naturally larger and the demands on the processing device are higher. After the flexible nodes are determined, the preset distance is extended to both sides from each node's position to locate the corresponding frame shape points, and a three-dimensional coordinate system is then established to determine the original coordinates of the frame shape points corresponding to each flexible node on the centerline. Finally, the basic shape frame of the flexible navigation guidance arrow is drawn based on the original coordinates of the frame shape points.
Illustratively, fig. 6 is a schematic diagram of generating a basic shape frame of a flexible navigation guidance arrow according to an embodiment of the present application. As shown in fig. 6, in practical application, the arrow center line L may be drawn based on the map road network shapes of the entry road and exit road of the target intersection; then, flexible nodes A (arrow end), B (arrow middle point), and C (arrow starting point) are taken on the arrow center line, and, based on each of the flexible nodes A, B, and C, the preset distance d is extended to both sides with the arrow center line L as the center, so that the frame shape points A1, A2, B1, B2, C1, and C2 corresponding to the flexible nodes can be determined; then, in a three-dimensional coordinate system, the original coordinates of the frame shape points A1, A2, B1, B2, C1, and C2 are determined; based on the original coordinates of the frame shape points A1, A2, B1, B2, C1, and C2, the basic shape frame F of the flexible navigation guidance arrow can be drawn.
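The widening step — taking the center line as the center and extending the preset distance d perpendicular to it on each side — can be sketched as below. This is an assumed implementation (offsetting along the local unit normal); the patent does not specify the exact offset construction:

```python
import math

def build_frame_points(centerline_nodes, d):
    """For each flexible node (x, y) on the arrow centerline, compute the two
    frame shape points at distance d on either side, perpendicular to the
    local direction of the centerline. Assumes distinct consecutive nodes."""
    frame = []
    for i, (px, py) in enumerate(centerline_nodes):
        # Local direction: toward the next node (or from the previous one, at the end).
        if i < len(centerline_nodes) - 1:
            dx = centerline_nodes[i + 1][0] - px
            dy = centerline_nodes[i + 1][1] - py
        else:
            dx = px - centerline_nodes[i - 1][0]
            dy = py - centerline_nodes[i - 1][1]
        norm = math.hypot(dx, dy)
        nx, ny = -dy / norm, dx / norm  # unit normal, perpendicular to the centerline
        # The two frame shape points (e.g. A1 and A2 for flexible node A).
        frame.append(((px + d * nx, py + d * ny), (px - d * nx, py - d * ny)))
    return frame
```

For a centerline A, B, C this yields the pairs (A1, A2), (B1, B2), (C1, C2) whose original coordinates define the basic shape frame F.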
S502, acquiring camera parameter information corresponding to the target intersection, determining the flip angle of the flexible node at the arrow end position on the arrow center line based on the camera parameter information, and determining the flip angles of the flexible nodes other than the flexible node at the arrow end position; the camera parameter information characterizes relevant parameters of the virtual camera when navigating to the target intersection.
After drawing the basic shape frame of the flexible navigation guidance arrow, the present application needs to acquire the camera parameter information corresponding to the target intersection, so as to determine, based on the camera parameter information, the flip angle of the flexible node at the arrow end position on the arrow center line, and to determine the flip angles of the flexible nodes other than the flexible node at the arrow end position.
The camera parameter information characterizes relevant parameters of the virtual camera when navigating to the target intersection; different intersections correspond to different camera parameter information, and the flip angle of the flexible node at the arrow end position determined based on the camera parameter information differs accordingly. Illustratively, the camera parameter information may include one or more of the following: camera height, first distance, second distance, elevation angle, and preset observation angle. The camera height represents the height of the virtual camera relative to the map surface, the first distance represents the distance between the foot of the perpendicular from the virtual camera to the map surface and the vehicle icon, the second distance represents the distance between the vehicle icon and the flexible node at the arrow end position on the arrow center line, and the preset observation angle represents the included angle formed between the virtual camera and the cross section at the arrow end position on the flipped basic shape frame.
Fig. 7 is a schematic diagram of parameter information when determining a flip angle of a flexible node at an end position of an arrow according to an embodiment of the present application. As shown in fig. 7, after determining the camera height h, the first distance x, the second distance y, the elevation angle α, and the preset observation angle β in the camera parameter information, the flip angle γ of the flexible node at the end position of the arrow can be calculated according to the functional relationship between them. The specific function calculation process is not limited in this application, and will not be described in detail herein.
The preset observation angle characterizes the included angle formed between the virtual camera and the cross section at the arrow end position on the flipped basic shape frame. The present application does not limit the specific value of the preset observation angle, which can be adjusted flexibly so as to obtain a better guidance visual effect during navigation. For example, the preset observation angle may default to 90 degrees; the flip angle determined based on this angle gives the subsequently generated navigation guidance arrow a better visual guidance effect.
In one example, if the preset observation angle is 90 degrees, determining the flip angle of the flexible node at the end position of the arrow on the arrow centerline based on the camera parameter information may include:
determining the sum of the first distance and the second distance, determining the ratio of the sum to the camera height, and determining the flip angle of the flexible node at the arrow end position on the arrow center line based on the ratio and a preset functional relation.
As an example, based on fig. 7, when the preset observation angle β is 90 degrees, it can be determined from the trigonometric relations that the flip angle γ is equal to angle 1. Then, when determining the flip angle of the flexible node at the arrow end position on the arrow center line, the sum (x+y) of the first distance x and the second distance y can be determined directly, the ratio (x+y)/h of that sum to the camera height h is then determined, and the flip angle γ of the flexible node at the arrow end position on the arrow center line can be calculated with the inverse trigonometric function, i.e., γ = arctan((x+y)/h).
For example, when the preset observation angle β is not 90 degrees, angle 2 may be calculated from the camera height h, the first distance x, and the second distance y, and the flip angle γ may then be calculated from angle 2 and the preset observation angle β. In summary, there are various ways to calculate the flip angle γ of the flexible node at the arrow end position on the arrow center line, and any of them may be adopted in the present application. The calculated flip angle γ of the flexible node at the arrow end position on the arrow center line is less than 90 degrees.
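A sketch of this calculation is given below. The β = 90° case reduces to γ = arctan((x+y)/h) as described; for β ≠ 90° the code assumes γ, β, and angle 2 sum to 180 degrees (a triangle relation consistent with the 90-degree case), which is one plausible reading of fig. 7 — the patent itself leaves the exact function open. Names are illustrative:

```python
import math

def end_node_flip_angle(h, x, y, beta_deg=90.0):
    """Flip angle (degrees) of the flexible node at the arrow end position.

    h        -- camera height relative to the map surface
    x        -- first distance: foot of the camera's perpendicular -> vehicle icon
    y        -- second distance: vehicle icon -> arrow-end flexible node
    beta_deg -- preset observation angle (default 90 degrees)

    "angle 2" is the angle at the camera between the vertical and... here taken
    as atan2(h, x + y); with beta = 90 the result equals atan((x + y) / h).
    """
    angle2 = math.degrees(math.atan2(h, x + y))
    gamma = 180.0 - beta_deg - angle2  # assumed triangle relation
    return gamma
```

With h = 1, x + y = 1 and β = 90° this gives γ = 45°, i.e. arctan(1), matching the inverse-trigonometric formula above; γ stays below 90 degrees for positive h.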
Illustratively, after determining the flip angle of the flexible node at the arrow end position on the arrow centerline, the flip angle of each flexible node other than the flexible node at the arrow end position may then be determined. In one example, determining the flip angle of each of the other flexible nodes except the flexible node at the end of the arrow may include:
S10, determining a flip angle range based on the flip angle of the flexible node at the arrow end position on the arrow center line; the flip angle range takes the flip angle of the flexible node at the arrow end position as its maximum value and the flip angle of the flexible node at the arrow starting point position on the arrow center line as its minimum value.
S20, determining the flip angles of the flexible nodes other than the flexible node at the arrow end position based on the flip angle range and a preset angle determination strategy; the preset angle determination strategy is an arithmetic distribution or a nonlinear variation.
Illustratively, the present application takes the flip angle of the flexible node at the arrow end position as the maximum and the flip angle of the flexible node at the arrow starting point position on the arrow center line as the minimum. The arrow starting point position is closer to the vehicle and can be observed completely by the user, so the flip angle at the flexible node at the arrow starting point position can be set to 0 degrees, i.e., no flipping is performed there. The flip angles of the other flexible nodes between the arrow starting point and the arrow end fall within the flip angle range, and can be determined one by one through an arithmetic distribution or a nonlinear variation.
For example, if the flip angle of the flexible node at the arrow starting point position is determined to be 0 degrees and the calculated flip angle of the flexible node at the arrow end position is 60 degrees, then the flip angles of the other flexible nodes should lie between 0 and 60 degrees; for example, the flip angle of the flexible node at the arrow middle position may be 30 degrees, which the present application does not limit.
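Both distribution strategies can be sketched in a few lines; the "ease_in" profile is just one possible nonlinear variation (an assumption — the patent names no specific curve):

```python
def node_flip_angles(n_nodes, gamma_end, mode="arithmetic"):
    """Distribute flip angles over the flexible nodes, from the arrow start
    (index 0, angle 0 degrees) to the arrow end (last index, gamma_end)."""
    if n_nodes < 2:
        raise ValueError("need at least the start and end nodes")
    if mode == "arithmetic":
        step = gamma_end / (n_nodes - 1)       # equal increments
        return [i * step for i in range(n_nodes)]
    if mode == "ease_in":                      # one possible nonlinear profile
        return [gamma_end * (i / (n_nodes - 1)) ** 2 for i in range(n_nodes)]
    raise ValueError(f"unknown mode: {mode}")
```

With 3 nodes and a 60-degree end angle, the arithmetic distribution reproduces the 0/30/60-degree example above.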
S503, determining, according to the flip angles of the flexible nodes, the three-dimensional coordinates of each frame shape point on the flipped basic shape frame.
For example, the basic shape frame of the flexible navigation guidance arrow drawn based on the road data information of the target intersection lies on the map surface; that is, the z value in the original coordinates of each frame shape point on the basic shape frame is 0, so the initial guidance arrow shape points can be regarded as two-dimensional plane coordinates, each having only x and y components. To keep the flexible navigation guidance arrow fully displayed on the map surface when the elevation angle of the virtual camera relative to the vehicle icon is smaller than 90 degrees, the three-dimensional coordinates of each frame shape point on the flipped basic shape frame need to be determined according to the flip angle of each flexible node. That is, a z value, i.e., a height relative to the map surface, is calculated for each frame shape point on the basic shape frame, so that a subsequent adjustment can place every point on the map surface.
In one example, determining the three-dimensional coordinates of each frame shape point on the flipped basic shape frame according to the flip angle of each flexible node may include:
S100, determining the height of each frame shape point on the flipped basic shape frame relative to the map surface based on the flip angle of each flexible node and the preset distance.
S200, determining the three-dimensional coordinates of each frame shape point on the flipped basic shape frame based on the original coordinates and the height of the frame shape point.
Illustratively, fig. 8 is a schematic diagram of a basic shape frame according to an embodiment of the present application. As shown in fig. 8, the basic shape frame of the flexible navigation guidance arrow lies on the map surface of the navigation map, and there are a plurality of flexible nodes such as points A, B, and C on the arrow center line. After widening from the arrow center line, each flexible node has two corresponding frame shape points, such as frame shape points A1 and A2 for flexible node A, B1 and B2 for flexible node B, and C1 and C2 for flexible node C. The shape of the basic shape frame of the flexible navigation guidance arrow is ultimately determined by these frame shape points, whose original coordinates have been determined as described above.
Fig. 9 is an exemplary diagram of determining the height of a frame shape point relative to the map surface according to an embodiment of the present application. As shown in fig. 9, taking the flexible node A as an example, if the basic shape frame at flexible node A is rotated by γ degrees toward the virtual camera, the frame shape point A1 rises above the map surface while the frame shape point A2 drops below it, and the height z of the frame shape points A1 and A2 relative to the map surface can be calculated from the distance d between A and A1 and the flip angle γ. The distance d between A and A1 is the preset distance extended to each side of the arrow center line, i.e., half the width of the basic shape frame.
Accordingly, based on the original coordinates of the frame shape points and their height z relative to the map surface, the three-dimensional coordinates of each frame shape point on the flipped basic shape frame can be determined.
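One consistent reading of figs. 8 and 9 treats the flip as a rigid rotation of the cross-section about the centerline node: the height becomes d·sin γ and the in-plane half-width shrinks to d·cos γ. This is an assumption — the patent only states that z is computed from d and γ — and the names are illustrative:

```python
import math

def flipped_frame_coords(node, normal, d, gamma_deg):
    """3-D coordinates of the two frame shape points for one flexible node
    after rotating its cross-section by gamma degrees about the centerline.

    node:   (x, y) of the flexible node on the map surface
    normal: unit normal to the centerline in the map plane
    d:      preset distance (half the frame width)
    """
    g = math.radians(gamma_deg)
    lateral = d * math.cos(g)   # in-plane offset shrinks as the section tilts
    z = d * math.sin(g)         # height relative to the map surface
    px, py = node
    nx, ny = normal
    p1 = (px + lateral * nx, py + lateral * ny,  z)   # e.g. A1, above the map
    p2 = (px - lateral * nx, py - lateral * ny, -z)   # e.g. A2, below the map
    return p1, p2
```

At γ = 0 the points stay on the map surface at their original coordinates; at γ = 90 degrees the cross-section stands vertical, with A1 at height d and A2 at depth −d.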
Since the z value of a flipped frame shape point may be negative, i.e., part of the flipped basic shape frame may lie below the map surface of the navigation map, in order to ensure a good visual effect, in some examples, after determining the height of each frame shape point on the flipped basic shape frame relative to the map surface, the method may further include:
determining the value of the height, and if the value of the height is smaller than 0, uniformly adjusting the height of each frame shape point on the flipped basic shape frame relative to the map surface.
For example, if there is a negative number among the z values of the flipped frame shape points, the compensation value with the largest absolute value among those values may be found, and the height of each frame shape point relative to the map surface uniformly adjusted by this compensation value, so that every frame shape point on the flipped basic shape frame lies on the map surface.
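A minimal sketch of this uniform adjustment, assuming the adjustment value is the most negative z and all points are shifted up by its absolute value (names illustrative):

```python
def lift_to_map_surface(points):
    """If any frame shape point has negative z, shift every point up by the
    largest-magnitude negative z so all points lie on or above the map surface.
    points: list of (x, y, z) tuples."""
    min_z = min(z for (_, _, z) in points)
    if min_z >= 0:
        return points               # already on or above the map surface
    offset = -min_z                 # largest absolute value among negative z's
    return [(x, y, z + offset) for (x, y, z) in points]
```

Because the shift is uniform, the flipped frame keeps its shape; only its base is raised onto the map surface.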
S504, performing drawing and rendering processing based on the three-dimensional coordinates, and determining the flexible navigation guidance arrow.
By way of example, after the three-dimensional coordinates of each frame shape point on the flipped basic shape frame are determined, drawing and rendering processing is performed directly based on the three-dimensional coordinates, and the flexible navigation guidance arrow of the target intersection can then be obtained. Since every frame shape point on the flipped basic shape frame lies on the map surface, the drawn and rendered flexible navigation guidance arrow also lies on the map surface. When applied to an actual navigation scene, the rendered flexible navigation guidance arrow is as shown in fig. 4: the arrow is displayed completely on the map surface, and the user can clearly see the direction in which it points.
The method for generating the flexible navigation guidance arrow provided by the embodiment of the present application includes: acquiring road data information of the target intersection, and drawing a basic shape frame of the flexible navigation guidance arrow based on the road data information of the target intersection, wherein the road data information includes the map road network shapes of the entry road and exit road of the target intersection, the basic shape frame includes an arrow center line consistent with the road shape, the arrow center line includes at least 3 flexible nodes, and each flexible node corresponds to 2 frame shape points; acquiring camera parameter information corresponding to the target intersection, determining the flip angle of the flexible node at the arrow end position on the arrow center line based on the camera parameter information, and determining the flip angles of the flexible nodes other than the flexible node at the arrow end position, wherein the camera parameter information characterizes relevant parameters of the virtual camera when navigating to the target intersection; determining, according to the flip angles of the flexible nodes, the three-dimensional coordinates of the frame shape points on the flipped basic shape frame; and performing drawing and rendering processing based on the three-dimensional coordinates to determine the flexible navigation guidance arrow. With this method, the drawn and rendered flexible navigation guidance arrow can be displayed completely on the map surface even when the elevation angle of the virtual camera relative to the vehicle icon is smaller than 90 degrees, so that the user can clearly observe and recognize it during navigation, which improves both the accuracy of navigation guidance and the user experience.
The following are device embodiments of the present application, which may be used to perform method embodiments of the present application. For details not disclosed in the device embodiments of the present application, please refer to the method embodiments of the present application.
Fig. 10 is a schematic structural diagram of a display device of a navigation guidance arrow in a navigation map according to an embodiment of the present application. As shown in fig. 10, a display device 10 of a navigation guidance arrow in a navigation map provided in an embodiment of the present application includes a display unit 100.
The display unit 100 is configured to display a flexible navigation guidance arrow on a three-dimensional map road network shape of a target intersection in a navigation map based on a preset navigation policy.
The flexible navigation guidance arrow is drawn and rendered after the basic shape frame of the flexible navigation guidance arrow, determined based on the target intersection, is flipped toward the virtual camera by the corresponding flip angle, the flip angle being determined based on the camera parameter information corresponding to the target intersection.
The device provided in this embodiment may be used to perform the method of the foregoing embodiment, and its implementation principle and technical effects are similar, and will not be described herein again.
Fig. 11 is a schematic structural diagram of a display device of a navigation guidance arrow in a navigation map according to another embodiment of the present application. As shown in fig. 11, the display device 11 of the navigation guidance arrow in the navigation map provided in the embodiment of the present application includes a display unit 110.
The display unit 110 is configured to display a flexible navigation guidance arrow on a three-dimensional map road network shape of a target intersection in a navigation map based on a preset navigation policy.
The flexible navigation guidance arrow is drawn and rendered after the basic shape frame of the flexible navigation guidance arrow, determined based on the target intersection, is flipped toward the virtual camera by the corresponding flip angle, the flip angle being determined based on the camera parameter information corresponding to the target intersection.
In one example, the display device 11 further includes an arrow generating unit 120, and the arrow generating unit 120 includes a frame drawing module 1201, an angle determining module 1202, a coordinate determining module 1203, and a drawing rendering module 1204.
The frame drawing module 1201 is configured to obtain road data information of a target intersection, and draw a basic shape frame of a flexible navigation guidance arrow based on the road data information of the target intersection; the road data information comprises map road network shapes of an entering road and an exiting road of a target intersection, an arrow center line is included in a basic shape frame, the arrow center line is consistent with the road shape, at least 3 flexible nodes are included on the arrow center line, and each flexible node corresponds to 2 frame shape points.
The angle determining module 1202 is configured to acquire the camera parameter information corresponding to the target intersection, determine the flip angle of the flexible node at the arrow end position on the arrow center line based on the camera parameter information, and determine the flip angles of the flexible nodes other than the flexible node at the arrow end position; the camera parameter information characterizes relevant parameters of the virtual camera when navigating to the target intersection;
the coordinate determining module 1203 is configured to determine, according to the flip angle of each flexible node, the three-dimensional coordinates of the frame shape points on the flipped basic shape frame;
and the drawing rendering module 1204 is used for carrying out drawing rendering processing based on the three-dimensional coordinates and determining a flexible navigation guiding arrow.
In one example, the bezel rendering module 1201 includes a first rendering module 12011, a second rendering module 12012, and a third rendering module 12013.
The first drawing module 12011 is configured to draw an arrow center line based on a map road network shape of an entering road and an exiting road of the target intersection.
The second drawing module 12012 is configured to extend a preset distance to two sides of the arrow center line with the arrow center line as a center, and determine an original coordinate of a frame shape point corresponding to each flexible node on the arrow center line.
A third drawing module 12013, configured to draw a base shape frame of the flexible navigation guidance arrow based on the original coordinates of the frame shape point.
In one example, the camera parameter information includes one or more of the following: camera height, first distance, second distance, elevation angle, and preset viewing angle;
the camera height represents the height of the virtual camera relative to the map surface, the first distance represents the distance between the foot of the perpendicular from the virtual camera to the map surface and the vehicle icon, the second distance represents the distance between the vehicle icon and the flexible node at the arrow end position on the arrow center line, and the preset observation angle represents the included angle formed between the virtual camera and the cross section at the arrow end position on the flipped basic shape frame.
In one example, the angle determination module 1202 includes a first determination module 12021.
The first determining module 12021 is configured to determine a sum of distances between the first distance and the second distance, determine a ratio of the sum of distances to the camera height, and determine a flip angle of the flexible node at the end position of the arrow on the arrow centerline based on the ratio and a preset functional relation.
In one example, the angle determination module 1202 further includes a second determination module 12022 and a third determination module 12023.
A second determining module 12022, configured to determine a flip angle range based on the flip angle of the flexible node at the arrow end position on the arrow center line; the flip angle range takes the flip angle of the flexible node at the arrow end position as its maximum value and the flip angle of the flexible node at the arrow starting point position on the arrow center line as its minimum value.
A third determining module 12023, configured to determine the flip angles of the flexible nodes other than the flexible node at the arrow end position based on the flip angle range and a preset angle determination strategy; the preset angle determination strategy is an arithmetic distribution or a nonlinear variation.
In one example, the coordinate determination module 1203 includes a first calculation module 12031 and a second calculation module 12032.
The first calculation module 12031 is configured to determine, based on the flip angle and the preset distance of each flexible node, a height of each frame shape point on the flipped basic shape frame relative to the map surface.
The second calculation module 12032 is configured to determine three-dimensional coordinates of each frame shape point on the flipped basic shape frame based on the original coordinates and the height of the frame shape point.
In one example, the coordinate determination module 1203 further includes an adjustment module 12033.
The adjustment module 12033 is configured to, after the height of each frame shape point on the flipped basic shape frame relative to the map surface is determined, uniformly adjust the height of each frame shape point on the flipped basic shape frame relative to the map surface if the determined height is smaller than 0.
The device provided in this embodiment may be used to perform the method of the foregoing embodiment, and its implementation principle and technical effects are similar, and will not be described herein again.
It should be noted that the division of the modules of the above apparatus is merely a division of logical functions; in actual implementation, the modules may be fully or partially integrated into one physical entity, or may be physically separate. These modules may all be implemented in the form of software called by a processing element, or all in hardware; alternatively, some modules may be implemented in the form of software called by a processing element while others are implemented in hardware. The functions of the above modules may be stored in a memory of the above apparatus in the form of program code, and called and executed by a processing element of the apparatus. The implementation of the other modules is similar. In addition, all or part of these modules may be integrated together or implemented independently. The processing element here may be an integrated circuit with signal processing capability. In implementation, each step of the above method, or each of the above modules, may be completed by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
Fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 12, the electronic device 12 includes: a processor 01, and a memory 02 communicatively coupled to the processor.
The memory 02 stores computer-executable instructions; the processor 01 executes the computer-executable instructions stored in the memory 02 to implement the method according to any one of the foregoing embodiments.
In the specific implementation of the electronic device described above, it should be understood that the processor may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The method disclosed in connection with the embodiments of the present application may be embodied directly in hardware processor execution or in a combination of hardware and software modules in a processor.
Embodiments of the present application also provide a computer-readable storage medium having computer-executable instructions stored therein, the computer-executable instructions, when executed by a processor, being configured to implement the method according to any one of the foregoing embodiments.
Those of ordinary skill in the art will appreciate that all or part of the steps for implementing the above method embodiments may be completed by hardware related to program instructions. The foregoing program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
Embodiments of the present application also provide a computer program product, including a computer program which, when executed by a processor, implements the method according to any one of the foregoing embodiments.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It is to be understood that the present application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (11)

1. A method for displaying navigation guidance arrows in a navigation map, the method comprising:
based on a preset navigation strategy, displaying a flexible navigation guiding arrow on the three-dimensional map road network shape of a target intersection in a navigation map;
the flexible navigation guidance arrow is drawn and rendered after the basic shape frame of the flexible navigation guidance arrow, determined based on the target intersection, is flipped toward the virtual camera by the corresponding flip angle, the flip angle being determined based on the camera parameter information corresponding to the target intersection.
2. The method of claim 1, wherein prior to displaying the flexible navigational direction arrow, the method further comprises:
acquiring road data information of the target intersection, and drawing a basic shape frame of the flexible navigation guiding arrow based on the road data information of the target intersection; the road data information comprises map road network shapes of an entering road and an exiting road of the target intersection, an arrow center line is included in a basic shape frame, the arrow center line is consistent with the road shape, at least 3 flexible nodes are included on the arrow center line, and each flexible node corresponds to 2 frame shape points;
acquiring camera parameter information corresponding to the target intersection, determining the flip angle of the flexible node at the arrow end position on the arrow center line based on the camera parameter information, and determining the flip angles of the flexible nodes other than the flexible node at the arrow end position; wherein the camera parameter information characterizes relevant parameters of the virtual camera when navigating to the target intersection;
determining, according to the flip angles of the flexible nodes, the three-dimensional coordinates of the frame shape points on the flipped basic shape frame;
and carrying out drawing rendering processing based on the three-dimensional coordinates, and determining the flexible navigation guiding arrow.
3. The method of claim 2, wherein drawing the base shape border of the flexible navigational direction arrow based on the road data information of the target intersection comprises:
drawing an arrow center line based on the map road network shape of the entering road and the exiting road of the target intersection;
extending a preset distance to each side of the arrow center line with the arrow center line as the center, and determining the original coordinates of the frame shape points corresponding to the flexible nodes on the arrow center line;
and drawing the basic shape frame of the flexible navigation guidance arrow based on the original coordinates of the frame shape points.
4. The method of claim 2, wherein the camera parameter information includes one or more of the following: camera height, first distance, second distance, elevation angle, and preset observation angle;
wherein the camera height represents the height of the virtual camera relative to the map surface, the first distance represents the distance between the foot of the perpendicular from the virtual camera to the map surface and the vehicle icon, the second distance represents the distance between the vehicle icon and the flexible node at the arrow end position on the arrow center line, and the preset observation angle represents the included angle formed between the virtual camera and the cross section at the arrow end position on the flipped basic shape frame.
5. The method of claim 4, wherein, in a case where the preset viewing angle is 90 degrees, determining the flip angle of the flexible node at the arrow-end position on the arrow center line based on the camera parameter information comprises:
determining the sum of the first distance and the second distance, determining the ratio of the sum to the camera height, and determining the flip angle of the flexible node at the arrow-end position on the arrow center line based on the ratio and a preset functional relation.
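Claim 5 does not name the "preset functional relation". Given the right triangle formed by the camera height and the horizontal distance to the arrow-end node, a natural candidate is the arctangent; the following is a sketch under that assumption only:

```python
import math

def end_node_flip_angle(camera_height, first_distance, second_distance):
    """Flip angle (radians) of the flexible node at the arrow-end position.

    Assumption: the 'preset functional relation' of claim 5 is the
    arctangent of (first_distance + second_distance) / camera_height,
    i.e. the angle between the vertical and the camera-to-node sightline.
    Under this reading, flipping the end cross-section by this angle makes
    it face the camera when the preset viewing angle is 90 degrees.
    """
    ratio = (first_distance + second_distance) / camera_height
    return math.atan(ratio)
```

For example, with the camera 10 units above the map plane and a total horizontal distance of 10 units, the end node would flip by 45 degrees.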
6. The method of claim 2, wherein determining the flip angles of the remaining flexible nodes other than the flexible node at the arrow-end position comprises:
determining a flip-angle range based on the flip angle of the flexible node at the arrow-end position on the arrow center line; wherein the flip-angle range takes the flip angle of the flexible node at the arrow-end position as its maximum value and the flip angle of the flexible node at the arrow-start position on the arrow center line as its minimum value;
and determining the flip angles of the remaining flexible nodes based on the flip-angle range and a preset angle-determination strategy; wherein the preset angle-determination strategy is an arithmetic distribution or a nonlinear variation.
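The two strategies of claim 6 can be sketched as interpolating from the minimum (arrow start) to the maximum (arrow end) of the flip-angle range. The quadratic ramp below is only one example of a "nonlinear variation"; the claim does not specify the profile:

```python
def node_flip_angles(num_nodes, start_angle, end_angle, strategy="arithmetic"):
    """Flip angles for all flexible nodes, from the arrow-start node
    (minimum, start_angle) to the arrow-end node (maximum, end_angle).

    strategy:
      "arithmetic" - equal steps between nodes (arithmetic distribution);
      "ease_in"    - a hypothetical quadratic profile, one instance of the
                     claim's 'nonlinear variation' (the arrow then bends
                     mostly near its tip).
    """
    angles = []
    for i in range(num_nodes):
        t = i / (num_nodes - 1)          # 0 at arrow start, 1 at arrow end
        if strategy == "ease_in":
            t = t * t                    # illustrative nonlinear profile
        angles.append(start_angle + t * (end_angle - start_angle))
    return angles
```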
7. The method of claim 3, wherein determining the three-dimensional coordinates of each frame shape point on the flipped basic shape frame based on the flip angle of each flexible node comprises:
determining the height of each frame shape point on the flipped basic shape frame relative to the map plane based on the flip angle of each flexible node and the preset distance;
and determining the three-dimensional coordinates of each frame shape point on the flipped basic shape frame based on the original coordinates and the height of the frame shape point.
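Claim 7 derives each point's height from its node's flip angle and the preset distance but does not fix the geometry. One plausible reading, offered here purely as a guess, is that the lift is the preset distance scaled by the sine of the node's flip angle, combined with the map-plane coordinates from claim 3:

```python
import math

def frame_point_3d(orig_xy, flip_angle, preset_distance):
    """3D coordinate of one frame shape point on the flipped frame.

    Hypothetical geometry (not stated in the claim): the height above the
    map plane is preset_distance * sin(flip_angle); the original x, y from
    claim 3 are kept unchanged.
    """
    x, y = orig_xy
    z = preset_distance * math.sin(flip_angle)
    return (x, y, z)
```

Under this reading, an unflipped node (angle 0) stays on the map plane and the height grows monotonically with the flip angle, consistent with the arrow tail rising toward the camera.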
8. The method of claim 7, wherein after determining the height of each frame shape point on the flipped basic shape frame relative to the map plane, the method further comprises:
checking the value of each height, and if any height value is smaller than 0, uniformly adjusting the heights of all frame shape points on the flipped basic shape frame relative to the map plane.
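The uniform adjustment of claim 8 can be sketched as a single vertical shift applied to every frame shape point, assuming the intent is to keep the whole flipped frame on or above the map plane (the claim does not state the adjustment amount):

```python
def lift_above_map(points_3d):
    """If any frame shape point would sink below the map plane (height < 0),
    shift every point up by the same amount so the minimum height becomes 0.

    points_3d: list of (x, y, z) frame shape points.
    """
    min_z = min(z for _, _, z in points_3d)
    if min_z < 0:
        points_3d = [(x, y, z - min_z) for x, y, z in points_3d]
    return points_3d
```

Because the shift is uniform, the relative shape of the flipped frame is preserved; only its placement above the map plane changes.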
9. A display device for a navigation guide arrow in a navigation map, the device comprising:
a display unit, configured to display a flexible navigation guide arrow on the three-dimensional map road network shape of a target intersection in the navigation map based on a preset navigation strategy;
wherein the flexible navigation guide arrow is drawn and rendered after its basic shape frame, determined based on the target intersection, is flipped toward the virtual camera by corresponding flip angles, the flip angles being determined based on camera parameter information corresponding to the target intersection.
10. An electronic device, comprising: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored in the memory to implement the method of any one of claims 1-8.
11. A computer-readable storage medium having computer-executable instructions stored therein which, when executed by a processor, implement the method of any one of claims 1-8.
CN202311319875.3A 2023-10-11 2023-10-11 Display method, device and equipment of navigation guide arrow in navigation map Pending CN117387653A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311319875.3A CN117387653A (en) 2023-10-11 2023-10-11 Display method, device and equipment of navigation guide arrow in navigation map

Publications (1)

Publication Number Publication Date
CN117387653A 2024-01-12

Family

ID=89467672


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination