CN112556685B - Navigation route display method and device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN112556685B
CN112556685B (application CN202011419148.0A)
Authority
CN
China
Prior art keywords
live
navigation route
action
target
navigation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011419148.0A
Other languages
Chinese (zh)
Other versions
CN112556685A (en)
Inventor
金永庆
宋孟肖
梅树起
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202011419148.0A
Publication of CN112556685A
Application granted
Publication of CN112556685B

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions

Abstract

The invention discloses a navigation route display method and apparatus, a storage medium and electronic equipment for an artificial-intelligence scene, and in particular relates to high-precision map technology in environment perception. The method comprises the following steps: acquiring a live-action navigation request triggered in a target client of a target terminal; responding to the live-action navigation request by acquiring a first live-action image currently captured by the target terminal and a first navigation route generated for the target terminal in a target map; processing the first live-action image into an image meeting a detection condition, identifying candidate straight lines in the first live-action image when the detection condition is met, and calculating a first guide line based on the N candidate straight lines; determining display coordinates of track points on a second navigation route; and displaying the second navigation route on the first live-action image according to the display coordinates. The invention solves the technical problem of low display accuracy of navigation routes.

Description

Navigation route display method and device, storage medium and electronic equipment
Technical Field
The invention relates to the field of computers, in particular to a navigation route display method and device, a storage medium and electronic equipment.
Background
In recent years, applications of Augmented Reality (AR) technology have become increasingly widespread. For example, when AR technology is applied to the navigation field, an AR navigation product provides the user with a live-action navigation route; compared with a traditional navigation product, its obvious advantage is that the picture is displayed more intuitively.
However, AR navigation products in the prior art often obtain the navigation route in a standard planar map using only sensors of the user's mobile client, such as a Global Positioning System (GPS) receiver and a magnetometer, and then map that route directly onto the live-action image. The mapping calculated in this way may be offset, and the magnetometer may yield an inaccurate heading under magnetic interference; both cause the navigation route in the live-action image to be offset, so the accuracy of the navigation route cannot be guaranteed and the user experience is degraded. That is, the prior art has the technical problem that the display accuracy of the live-action navigation route is low.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiment of the invention provides a navigation route display method and device, a storage medium and electronic equipment, and at least solves the technical problem of low navigation route display accuracy.
According to an aspect of an embodiment of the present invention, there is provided a display method of a navigation route, including: acquiring a live-action navigation request triggered in a target client of a target terminal, wherein the live-action navigation request is used for requesting to display a target live-action navigation route on a live-action image acquired by the target terminal, and the target live-action navigation route is used for guiding the target terminal to move from a current position to a target position; responding to the live-action navigation request, acquiring a first live-action image currently acquired by the target terminal and a first navigation route generated for the target terminal in a target map, wherein the first navigation route is a movement prompting track which is identified in the target map and moves from the current position to the target position; calculating a first guide line based on N candidate straight lines when the N candidate straight lines are identified in the first live-action image, wherein a second direction indicated by the first guide line and a first direction indicated by the first navigation route satisfy a first preset relationship, and N is a positive integer; performing spatial coordinate transformation on points on the first guide line to determine display coordinates of track points on a second navigation route, wherein the second navigation route is a navigation prompt line which is identified in the first live-action image and guides the target terminal to move from the current position along the first direction; and displaying the second navigation route on the first live-action image according to the display coordinates.
According to another aspect of the embodiments of the present invention, there is also provided a display apparatus of a navigation route, including: a first obtaining unit, configured to obtain a live-action navigation request triggered in a target client of a target terminal, where the live-action navigation request is used to request a target live-action navigation route to be displayed on a live-action image acquired by the target terminal, and the target live-action navigation route is used to guide the target terminal to move from a current position to a target position; a response unit, configured to respond to the live-action navigation request, acquire a first live-action image currently acquired by the target terminal and a first navigation route generated for the target terminal in a target map, where the first navigation route is a movement prompt track identified in the target map and moving from the current position to the target position; a calculating unit, configured to calculate a first guide line based on N candidate straight lines when the N candidate straight lines are identified in the first live-action image, where a second direction indicated by the first guide line and a first direction indicated by the first navigation route satisfy a first preset relationship, and N is a positive integer; a first determining unit, configured to perform spatial coordinate transformation on points on the first guide line to determine display coordinates of track points on a second navigation route, where the second navigation route is a navigation prompt line that is identified in the first live-action image and that directs the target terminal to move from the current position in the first direction; and a display unit configured to display the second navigation route on the first live-action image according to the display coordinates.
According to still another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium having a computer program stored therein, wherein the computer program is configured to execute the above-mentioned display method of a navigation route when running.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor executes the method for displaying a navigation route through the computer program.
In the embodiment of the present invention, a live-action navigation request triggered in a target client of a target terminal is obtained, where the live-action navigation request is used to request that a target live-action navigation route be displayed on a live-action image acquired by the target terminal, and the target live-action navigation route is used to guide the target terminal to move from a current position to a target position; in response to the live-action navigation request, a first live-action image currently acquired by the target terminal and a first navigation route generated for the target terminal in a target map are acquired, where the first navigation route is a movement prompt track, identified in the target map, that moves from the current position to the target position; when N candidate straight lines are identified in the first live-action image, a first guide line is calculated based on the N candidate straight lines, where a second direction indicated by the first guide line and a first direction indicated by the first navigation route satisfy a first preset relationship, and N is a positive integer; spatial coordinate transformation is performed on points on the first guide line to determine the display coordinates of track points on a second navigation route, where the second navigation route is a navigation prompt line, identified in the first live-action image, that guides the target terminal to move from the current position along the first direction; and the second navigation route is displayed on the first live-action image according to the display coordinates. By acquiring a first guide line that satisfies the first preset relationship with the first direction indicated by the first navigation route, and using that guide line to correct the live-action navigation route finally displayed on the first live-action image, the technical purpose of correcting the live-action navigation route is achieved and the display accuracy of the live-action navigation guide line is improved, thereby solving the technical problem of low display accuracy of the navigation route.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a schematic diagram of an application environment of a method for displaying an alternative navigation route according to an embodiment of the invention;
FIG. 2 is a schematic flowchart of an alternative navigation route display method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an alternative navigation route display method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an alternative navigation route display method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of an alternative navigation route display method according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of an alternative navigation route display method according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of an alternative navigation route display method according to an embodiment of the present invention;
FIG. 8 is a schematic view of an alternative navigation route display device according to an embodiment of the present invention;
FIG. 9 is a schematic view of an alternative navigation route display device according to an embodiment of the present invention;
FIG. 10 is a schematic structural diagram of an alternative electronic device according to an embodiment of the invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an aspect of the embodiments of the present invention, a method for displaying a navigation route is provided. Optionally, as an optional implementation, the method may be applied to, but is not limited to, the environment shown in fig. 1. The system may include, but is not limited to, a user equipment 102, a network 110, and a server 112, where the user equipment 102 may include, but is not limited to, a display 108, a processor 106, and a memory 104. Optionally, a target client may be, but is not limited to being, installed on the user equipment 102 for providing a navigation route to the user and displaying the corresponding navigation route on the display 108. Optionally, the target client may include, but is not limited to, two modes for the user to select: one mode displays the first navigation route 1024 on the target map 1022, and the other displays the second navigation route 1028 on the live-action image 1026, where the target map 1022 is a planar map and the live-action image 1026 may be, but is not limited to, an image captured by a camera device of the user equipment 102.
The specific process comprises the following steps:
step S102, the user equipment 102 obtains a live-action navigation request triggered in the target client, where the live-action navigation request may but is not limited to carry a first navigation route 1024 and a live-action image 1026 currently acquired by the user equipment 102, for example, a virtual key "live-action navigation" set in the target client is used to indicate that the current display mode is switched to the live-action navigation mode, and optionally, after the live-action navigation request is triggered, the live-action image 1026 currently acquired by the user equipment 102 is obtained;
step S104-S106, the user equipment 102 sends a live-action navigation request to the server 112 through the network 110;
step S108, the server 112 receives the first navigation route 1024 and the live-action image 1026 carried in the live-action navigation request, and processes the first navigation route 1024 and the live-action image 1026 through the processing engine 116, so as to generate display coordinates of track points on the second navigation route 1028 on the live-action image 1026, wherein the first navigation route 1024 is used for correcting the second navigation route 1028;
steps S110-S112, the server 112 sends the second navigation route 1028 to the user device 102 via the network 110;
in step S114, the processor 106 in the user equipment 102 determines the display position of the second navigation route 1028 on the live-action image 1026 according to the display coordinates of the track points on the second navigation route 1028, displays the second navigation route 1028 on the display 108, and stores the display coordinates of the second navigation route 1028 in the memory 104. Optionally, the processing of the first navigation route 1024 and the live-action image 1026 to generate the display coordinates of the track points on the second navigation route 1028 in the live-action image 1026 may also be, but is not limited to being, performed in the user equipment 102 (i.e., locally); this is only an example and is not limiting.
Optionally, as an optional implementation manner, as shown in fig. 2, the display method of the navigation route includes:
s202, acquiring a live-action navigation request triggered in a target client of a target terminal, wherein the live-action navigation request is used for requesting to display a target live-action navigation route on a live-action image acquired by the target terminal, and the target live-action navigation route is used for guiding the target terminal to move from the current position to the target position;
s204, responding to the live-action navigation request, acquiring a first live-action image currently acquired by the target terminal and a first navigation route generated for the target terminal in the target map, wherein the first navigation route is a movement prompting track which is identified in the target map and moves from the current position to the target position;
s206, under the condition that N candidate straight lines are identified in the first live-action image, calculating a first guide line based on the N candidate straight lines, wherein a second direction indicated by the first guide line and a first direction indicated by the first navigation route meet a first preset relation, and N is a positive integer;
s208, performing space coordinate conversion on the points on the first guide line to determine display coordinates of track points on a second navigation route, wherein the second navigation route is a navigation prompt line which is marked in the first live-action image and guides the target terminal to move along the first direction from the current position;
and S210, displaying a second navigation route on the first live-action image according to the display coordinates.
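The steps S202-S210 above can be sketched in miniature as follows. This is only an illustrative sketch: the function names, the use of direction vectors, the averaging rule standing in for the "first preset relationship", and the 20-degree threshold are assumptions of this example rather than details disclosed in the embodiment.

```python
import numpy as np

def first_guide_line(route_dir, candidate_dirs, max_angle_deg=20.0):
    """S206 sketch: average the candidate line directions whose angle
    with the first navigation route's direction is below a threshold
    (one possible 'first preset relationship')."""
    route_dir = route_dir / np.linalg.norm(route_dir)
    kept = []
    for d in candidate_dirs:
        d = d / np.linalg.norm(d)
        # treat a line and its reverse as the same direction
        cos_a = abs(float(np.dot(route_dir, d)))
        if np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))) < max_angle_deg:
            kept.append(d if np.dot(route_dir, d) >= 0 else -d)
    if not kept:
        return route_dir          # fall back to the planned direction
    g = np.mean(kept, axis=0)
    return g / np.linalg.norm(g)

def display_coords(origin, guide_dir, n_points=5, step=1.0):
    """S208 placeholder: lay out the track points of the second
    navigation route along the first guide line."""
    return [origin + i * step * guide_dir for i in range(n_points)]

# S202/S204 stand-ins: route heading roughly 'up', three candidate lines.
route = np.array([0.0, 1.0])
candidates = [np.array([0.1, 1.0]), np.array([-0.1, 1.0]), np.array([1.0, 0.0])]
guide = first_guide_line(route, candidates)      # S206
points = display_coords(np.array([0.0, 0.0]), guide)  # S208; S210 would draw them
```

In this running example the two near-vertical candidate lines are averaged into a vertical guide line while the perpendicular one is rejected.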
Optionally, in this embodiment, the navigation route display method may be, but is not limited to being, applied to an AR live-action navigation scene. The live-action navigation route (i.e., the second navigation route) projected onto the live-action image is corrected by finding a straight line in the image whose direction is the same as, or similar to, that of the planned route (i.e., the first navigation route), which improves the display accuracy of the live-action navigation route and avoids erroneous guidance caused by route deviation. This may, but need not, solve the technical problem that the navigation route deviates in the live-action image owing to algorithmic or hardware accuracy limitations.
Optionally, in this embodiment, the target terminal may be, but is not limited to, a terminal device with an image capturing function, such as a smart phone, a car navigation device, and the like. The target client may be, but is not limited to, an application installed on the target terminal, and the application may be, but is not limited to, providing a guidance route matching the input information to a user of the target terminal, wherein the input information may include, but is not limited to, at least one of: departure point information, current position information (e.g., current position), destination position information (e.g., target position), travel mode information, and travel time information.
Optionally, in this embodiment, the target map may be, but is not limited to, a map displayed in the target client for presenting intuitive address information to the user, where a map may be, but is not limited to, a graphic or image that, on a plane or spherical surface and in a two-dimensional or multi-dimensional form, selectively represents certain phenomena of the Earth (or other celestial bodies) according to certain rules. Before acquiring the live-action navigation request triggered in the target client of the target terminal, the target client may, but is not limited to, generate a first navigation route based on the input information, where the first navigation route is displayed in a target map of the target client, and a display range of the target map corresponds to the input information; for example, the display range is positively correlated with the distance from the departure point to the destination location. Optionally, the first navigation route is displayed in the target map of the target client, and when a live-action navigation request triggered in the target client is detected, the navigation mode of the target client is switched from the planar map mode to the live-action map mode in response to the request; in the live-action map mode, the target terminal is instructed to acquire live-action images in real time, and the second navigation route is displayed on the acquired live-action images.
As a further example, as shown in FIG. 3(a), the target client has generated a first navigation route 304 based on input information (not shown in the figure) and displayed the first navigation route 304 in the target map 302. When a live-action navigation request is triggered in the target client (for example, the virtual button "live-action navigation" is selected), then, as shown in FIG. 3(b), a live-action image 306 is displayed on the target client, and a second navigation route 308 is displayed on the live-action image 306. When a live-action navigation exit request is triggered in the target client (e.g., the virtual button "x" is selected), the target client returns to displaying the first navigation route 304 in the target map 302, as in FIG. 3(a). For ease of understanding, the position and orientation of the target terminal (not shown in the figure) corresponding to the target client are assumed unchanged throughout this process; this is only an example and is not limiting.
Optionally, in this embodiment, the live-action navigation request may, but is not limited to, carry input information, and the obtaining of the input information and the triggering of the live-action navigation request may, but is not limited to, be on different interfaces or the same interface.
Further, taking the example that the obtaining of the input information and the triggering of the live-action navigation request are on the same interface, for example, as shown in fig. 4(a), optionally, in the current interface of the target client shown in fig. 4(a), the target client obtains the input information 402 (such as a departure place, a destination, a travel mode, a travel time, and the like), and also obtains the live-action navigation request triggered by the virtual button "live-action navigation" of the target client;
furthermore, when the target client acquires the input information 402 and the live-action navigation request, the background (e.g., the server) may, but is not limited to, preferentially instruct the target terminal corresponding to the target client to acquire the live-action image 404, generate the first navigation route 410 based on the input information 402 and the target map 408 in the database, calculate the coordinate data of the second navigation route 406 according to the first navigation route 410 and the live-action image 404, and send the coordinate data to the front end (e.g., the target client). Further, the front end displays a second navigation route 406 in the live-action image 404 based on the coordinate data, for example, as shown in fig. 4(b) in fig. 4;
further, assuming that a planar navigation request triggered in the target client shown in fig. 4(b) in fig. 4 is acquired (for example, the planar navigation request is triggered by selecting the virtual button "planar navigation"), as shown in fig. 4(c) in fig. 4, the navigation mode of the target client is switched from the live-action map mode to the planar map mode, the target map 408 is displayed in the target client, and the first navigation route 410 is displayed in the target map 408. Further, assuming that a plane navigation exit request is triggered in the target client shown in fig. 4(c) in fig. 4 (e.g., by selecting a virtual button "x" to trigger the plane navigation request), the navigation mode of the target client is converted from the plane map mode back to the live-action map mode, the live-action image 404 continues to be displayed in the target client, and the second navigation route 406 is displayed in the live-action image 404, as shown in fig. 4(b) in fig. 4.
Alternatively, in this embodiment, before the first guide line is calculated based on the N candidate straight lines, the two-dimensional coordinates of the first navigation route in the target map may be, but are not limited to being, converted into projection coordinates in a target coordinate system, and likewise the two-dimensional coordinates of the pixel straight lines in the first live-action image may be, but are not limited to being, converted into projection coordinates in the target coordinate system. Optionally, the target coordinate system may include, but is not limited to, at least one of: a rectangular coordinate system, a three-dimensional coordinate system, a Cartesian coordinate system, a cylindrical coordinate system, a spherical coordinate system, and the like.
For further example, the track points on the first navigation route in the target map are subjected to spatial coordinate transformation to determine the projection coordinates of the projected navigation straight line in the Cartesian coordinate system, and the pixel points on the N candidate straight lines in the first live-action image are likewise transformed to determine the projection coordinates of the candidate straight lines in the Cartesian coordinate system. Further, once the coordinate data of the first navigation route and of the N candidate straight lines in the Cartesian coordinate system are acquired, the target data of the first guide line in the Cartesian coordinate system is calculated based on that coordinate data.
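As one possible concretization of the spatial coordinate transformation above, a pixel point can be back-projected into a Cartesian camera coordinate system using the standard pinhole model. The intrinsic matrix below (focal lengths and principal point) is an assumed example for illustration, not a value from the patent.

```python
import numpy as np

# Assumed pinhole intrinsics: fx = fy = 800 px, principal point (320, 240).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def pixel_to_cartesian(u, v, depth=1.0):
    """Map homogeneous pixel coordinates (u, v, 1) to a 3-D point at
    the given depth along the viewing ray: X = depth * K^{-1} [u, v, 1]^T."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    return depth * ray

# The principal point maps onto the optical axis at unit depth.
center = pixel_to_cartesian(320.0, 240.0)
right = pixel_to_cartesian(1120.0, 240.0)  # 800 px right of center -> x = 1
```

Applying this transform to the endpoints of each candidate straight line yields the coordinate data in the Cartesian coordinate system from which the first guide line can be computed.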
Optionally, in this embodiment, spatial coordinate transformation is performed on the points on the first guide line to determine their display coordinates in the first live-action image, and the display coordinates of the track points on the second navigation route are determined from the display coordinates of the first guide line in the first live-action image. For example, based on the display coordinates of the first guide line in the first live-action image and the position of the center line of the live-action image, the display coordinates of the track points on the second navigation route are determined so that the second navigation route is displayed both accurately and pleasingly.
In addition, in this embodiment, the first navigation route may be mapped to the live-action image by the target client according to the camera pose calculated from the continuous images, and then the first navigation route mapped to the live-action image is corrected by using the display coordinate of the first guide line in the first live-action image, so as to determine the display coordinate of the track point on the second navigation route.
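A minimal sketch of such a correction step follows, assuming the pose-mapped route is corrected by a 2-D rotation about the current position so that its heading matches the first guide line's on-screen angle. The patent does not prescribe this particular correction; the rotation approach and all names here are illustrative assumptions.

```python
import math

def rotate_about(point, origin, angle):
    """Rotate a 2-D display point about an origin by 'angle' radians."""
    dx, dy = point[0] - origin[0], point[1] - origin[1]
    c, s = math.cos(angle), math.sin(angle)
    return (origin[0] + c * dx - s * dy, origin[1] + s * dx + c * dy)

def correct_route(route, guide_angle):
    """Align a mapped route (list of (x, y) display points starting at
    the current position) with the guide line's screen angle."""
    origin = route[0]
    mapped_angle = math.atan2(route[-1][1] - origin[1],
                              route[-1][0] - origin[0])
    delta = guide_angle - mapped_angle
    return [rotate_about(p, origin, delta) for p in route]

# A route drifting 45 degrees right of a vertical guide line is re-aligned.
route = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
corrected = correct_route(route, math.pi / 2)  # guide line points straight up
```

The corrected points then serve as the display coordinates of the track points on the second navigation route.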
It should be noted that a live-action navigation request triggered in a target client of a target terminal is obtained, where the live-action navigation request is used to request a target live-action navigation route to be displayed on a live-action image acquired by the target terminal, and the target live-action navigation route is used to guide the target terminal to move from a current position to a target position; responding to the live-action navigation request, acquiring a first live-action image currently acquired by a target terminal and a first navigation route generated for the target terminal in a target map, wherein the first navigation route is a movement prompting track which is identified in the target map and moves from a current position to a target position; processing the first live-action image into an image meeting a detection condition, identifying a candidate straight line in the first live-action image under the condition that the first live-action image meets the detection condition, and calculating a first guide line based on N candidate straight lines under the condition that the N candidate straight lines are identified in the first live-action image, wherein a second direction indicated by the first guide line and a first direction indicated by the first navigation route meet a first preset relation, and N is a positive integer; performing space coordinate conversion on points on the first guide line to determine display coordinates of track points on a second navigation route, wherein the second navigation route is a navigation prompt line which is marked in the first live-action image and used for guiding the target terminal to move along the first direction from the current position; and displaying the second navigation route on the first live-action image according to the display coordinates.
As a further example, optionally assume that an execution flow of the navigation route display method is shown in fig. 5, with the specific steps as follows:
step S502, the target terminal collects images in real time to obtain an original live-action image 502 (for example, a first live-action image);
step S504, performing preliminary processing on the original live-action image 502 to obtain a target live-action image 504 meeting the detection condition, where the detection condition may be, but is not limited to, that the image noise is less than or equal to a preset threshold, and the preliminary processing may be, but is not limited to, denoising;
step S506, performing a line detection operation on the target live-action image 504 to identify a plurality of candidate lines 506;
optionally, during the execution of steps S502-S506, the coordinate (pixel) points of the original live-action image 502, the target live-action image 504, and the candidate straight lines 506 are all located in the first coordinate system; in other words, the spatial dimensions of the original live-action image 502, the target live-action image 504, and the candidate straight lines 506 are consistent;
step S508, performing spatial transformation on the candidate straight line 506 to determine coordinate data of the candidate straight line 506 on the second coordinate system;
step S510, performing straight line filtering on the candidate straight lines 506 in the second coordinate system to filter out candidate straight lines 506 whose included angle with the traveling direction (e.g., the first direction indicated by the first navigation route) is greater than or equal to a target included angle, or in other words, to determine candidate straight lines 506 whose included angle with the traveling direction is smaller than the target included angle;
optionally, in the execution process of the above steps S508 to S510, the coordinate point of the candidate straight line 506 is located in the second coordinate system;
step S512, performing space conversion on the candidate straight line 506 with the included angle between the candidate straight line and the advancing direction smaller than the target included angle to determine the coordinate data of the candidate straight line 506 on the first coordinate system;
step S514, determining coordinate data of the second navigation route 510 on the first coordinate system based on the candidate straight line 506 on the first coordinate system;
step S516, displaying a second navigation route 510 on the original live-action image 502;
optionally, in the execution process of the above steps S512-S516, the coordinate points of the candidate straight line 506 and the second navigation route 510 are located in the first coordinate system.
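The flow of steps S502-S516 above can be compressed into a short sketch. Everything below is illustrative: the function names, the dictionary-based stand-in for an image, and the trivial stage bodies are all our assumptions; only the ordering of the stages follows the flow above.

```python
def denoise(image):                      # S504: preliminary processing
    return image                         # (a real pipeline would filter noise)

def detect_lines(image):                 # S506: straight line detection
    return list(image["line_angles"])    # angles stand in for detected lines

def to_second_cs(lines):                 # S508: spatial transformation
    return lines                         # identity here, for illustration only

def angle_filter(lines, travel, max_diff=30.0):   # S510: line filtering
    return [a for a in lines if abs(a - travel) < max_diff]

def to_first_cs(lines):                  # S512: back to image coordinates
    return lines

def build_route_angle(lines):            # S514: direction of the second route
    return sum(lines) / len(lines)

frame = {"line_angles": [88.0, 92.0, 30.0]}      # synthetic "image" (S502)
kept = angle_filter(to_second_cs(detect_lines(denoise(frame))), travel=90.0)
route_angle = build_route_angle(to_first_cs(kept))   # displayed in S516
```

With the synthetic angles above, the 30-degree line is filtered out and the remaining two average to the traveling direction.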
Optionally, during the execution of step S510, the straight line filtering may be, but is not limited to, performed in two stages: a first straight line filtering is performed on the candidate straight lines 506 in the second coordinate system, a spatial transformation is then performed to determine coordinate data of the candidate straight lines 506 in a third coordinate system (not shown in the figure), and a second straight line filtering is performed on the candidate straight lines 506 in the third coordinate system, where the first straight line filtering may be, but is not limited to, filtering out candidate straight lines 506 that deviate greatly from the traveling direction, and the second straight line filtering may be, but is not limited to, using calculation formulas such as the angle mean, the variance, and the like, to screen out more precise candidate straight lines 506 consistent with the traveling direction.
Optionally, in this embodiment, circular data suffers from a data jump problem. For example, if east is defined as 0 degrees, north as 90 degrees, and south as -90 degrees, then west may be either 180 degrees or -180 degrees; when the traveling direction is near due west, errors in the algorithm and the sensor can produce a string of values jumping between 180 degrees and -180 degrees, and complicated judgments would need to be introduced to avoid such jumps. If instead one full turn counterclockwise from east is defined as 0-360 degrees, the jump between 0 and 360 degrees must be considered. Such approaches are computable but overly complex, especially when the straight line filtering needs to calculate the angle mean, variance, and the like, where many corner cases must be handled. Therefore, assuming the second coordinate system is an angle coordinate system and the third coordinate system is a Cartesian coordinate system, each included angle x is converted into the vector (cos(x), sin(x)) using trigonometric functions, the mean of each component over all angles is calculated in the third coordinate system, and the mean angle is finally recovered from these component means, so that the calculation amount of the straight line filtering is greatly reduced and the efficiency of the straight line filtering is improved.
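A minimal sketch of this trigonometric averaging, assuming angles in degrees (the function name is our own):

```python
import math

def mean_angle(angles_deg):
    """Average circular data by mapping each angle x to the unit vector
    (cos(x), sin(x)), averaging the components, and converting back with
    atan2 -- this sidesteps the +/-180 (or 0/360) wrap-around jump."""
    n = len(angles_deg)
    cs = sum(math.cos(math.radians(a)) for a in angles_deg) / n
    sn = sum(math.sin(math.radians(a)) for a in angles_deg) / n
    return math.degrees(math.atan2(sn, cs))

# Headings scattered around due west: the naive arithmetic mean of these
# values is 0 degrees, while the vector mean stays near +/-180 degrees.
m = mean_angle([179.0, -179.0, 178.5, -178.5])
```

The naive mean of the sample readings is 0.0 (due east), whereas the vector mean correctly reports due west.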
According to the embodiment provided by the application, a live-action navigation request triggered in a target client of a target terminal is obtained, where the live-action navigation request is used to request that a target live-action navigation route be displayed on a live-action image acquired by the target terminal, and the target live-action navigation route is used to guide the target terminal to move from the current position to the target position; in response to the live-action navigation request, a first live-action image currently acquired by the target terminal and a first navigation route generated for the target terminal in a target map are acquired, where the first navigation route is a movement prompting track identified in the target map for moving from the current position to the target position; in a case where N candidate straight lines are identified in the first live-action image, a first guide line is calculated based on the N candidate straight lines, where a second direction indicated by the first guide line and a first direction indicated by the first navigation route satisfy a first preset relation, and N is a positive integer; spatial coordinate transformation is performed on points on the first guide line to determine display coordinates of track points on a second navigation route, where the second navigation route is a navigation prompt line identified in the first live-action image for guiding the target terminal to move from the current position along the first direction; and the second navigation route is displayed on the first live-action image according to the display coordinates. By acquiring a first guide line that satisfies the first preset relation with the first direction indicated by the first navigation route, and using this first guide line to guide the live-action navigation route finally displayed on the first live-action image, the technical purpose of correcting the live-action navigation route is achieved, and the technical effect of improving the display accuracy of the live-action navigation guide line is further realized.
As an alternative, the calculating the first guideline based on the N candidate straight lines includes:
s1, performing space coordinate transformation on the points on the N candidate straight lines to determine projection coordinates of track points on N first projection straight lines in the three-dimensional coordinate system;
s2, performing space coordinate transformation on the points on the first navigation route to determine the projection coordinates of the track points on the first projection navigation route in the three-dimensional coordinate system;
s3, a first guideline is calculated based on the first projected navigation route.
Alternatively, in this embodiment, the spatial coordinate transformation may be accomplished based on, but not limited to, camera parameters of the target terminal, where the camera parameters may be divided into, but not limited to, camera internal parameters and camera external parameters. The internal parameters describe the transformation from the camera coordinate system to the image plane and can be understood as a mathematical expression of the camera imaging rule; they are related to the camera hardware itself, and the transformation matrix is generally composed of the focal lengths fx, fy and the principal point cx, cy. Since the camera coordinate system uses mm as its unit while the image plane (e.g., the first live-action image) uses pixels, the internal parameters perform a linear conversion between the two coordinate systems. The external parameters describe the transformation from the camera coordinate system to the world coordinate system (e.g., a three-dimensional coordinate system) and can be understood as a description of the camera pose in the world coordinate system; they are related to the installation position and angle of the camera and are generally represented by a translation matrix T and a rotation matrix R, where the translation matrix gives the relative relationship of positions and the rotation matrix gives the relative relationship of orientations, so that together they fully express the pose. Alternatively, the camera external parameters and internal parameters may be combined, but not limited to, to derive a transformation matrix from the (pixel) points on the N candidate straight lines to coordinate points in the three-dimensional coordinate system.
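As an illustration of the internal parameter matrix just described: because K is upper triangular, inverting it reduces to two subtractions and two divisions, which is how a pixel is turned back into a normalized camera-frame ray (the extrinsic rotation would then carry the ray into the world frame). The function name and the sample intrinsics below are assumptions for the sketch.

```python
def pixel_to_ray(u, v, fx, fy, cx, cy):
    """Apply K^-1 to the homogeneous pixel (u, v, 1): subtract the
    principal point and divide by the focal lengths, giving a ray in
    normalized camera coordinates."""
    return ((u - cx) / fx, (v - cy) / fy, 1.0)

# Sample intrinsics (assumed): fx = fy = 800 px, principal point (320, 240).
# The principal point itself maps back onto the optical axis.
ray = pixel_to_ray(320.0, 240.0, 800.0, 800.0, 320.0, 240.0)
```

A pixel half a focal length right of the principal point maps to a ray with x-component 0.5, matching the linear mm-to-pixel relationship described above.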
It should be noted that the navigation route (the first projection navigation route) is issued by the server according to the orientation and position of the target client requesting the data, so its orientation is the real orientation in three-dimensional space (the three-dimensional coordinate system), with only the coordinate data differing; the real direction of a straight line (candidate straight line) detected on the live-action image, however, cannot be acquired directly, so the straight lines detected in the live-action image need to be projected into three-dimensional space. Specifically, spatial coordinate transformation is performed on the points on the N candidate straight lines to determine projection coordinates of track points on N first projection straight lines in the three-dimensional coordinate system; spatial coordinate transformation is performed on points on the first navigation route to determine projection coordinates of track points on the first projection navigation route in the three-dimensional coordinate system; and the first guide line is calculated based on the first projection navigation route.
According to the embodiment provided by the application, the space coordinate conversion is carried out on the points on the N candidate straight lines so as to determine the projection coordinates of the track points on the N first projection straight lines in the three-dimensional coordinate system; performing space coordinate conversion on points on the first navigation route to determine projection coordinates of track points on the first projection navigation route in a three-dimensional coordinate system; based on the first projection navigation route, the first guide line is calculated, the purpose of displaying the candidate straight line and the first projection navigation route in the same coordinate system is achieved, and the effect of improving the compatibility of the candidate straight line and the first projection navigation route is achieved.
As an optional scheme, performing spatial coordinate transformation on points on N candidate straight lines to determine projection coordinates of trajectory points on N first projection straight lines in a three-dimensional coordinate system, includes:
s1, acquiring first coordinate data of points on the N candidate straight lines on a two-dimensional coordinate system corresponding to the first live-action image, wherein the first coordinate data correspond to camera internal parameters of the target terminal;
s2, calculating a conversion matrix of the two-dimensional coordinate system and the three-dimensional coordinate system, wherein the conversion matrix corresponds to the camera external parameters of the target terminal;
and S3, converting the first coordinate data into second coordinate data based on the conversion matrix, wherein the second coordinate data are projection coordinate data of the track points on the N first projection straight lines in a three-dimensional coordinate system.
It should be noted that first coordinate data of points on the N candidate straight lines on a two-dimensional coordinate system corresponding to the first live-action image are acquired, wherein the first coordinate data correspond to camera internal parameters of the target terminal; calculating a conversion matrix of the two-dimensional coordinate system and the three-dimensional coordinate system, wherein the conversion matrix corresponds to the camera external parameters of the target terminal; and converting the first coordinate data into second coordinate data based on the conversion matrix, wherein the second coordinate data is projection coordinate data of track points on the N first projection straight lines in a three-dimensional coordinate system.
As a further example, optionally, the camera internal parameter of the target terminal is a 3x3 matrix K, the current rotation quaternion is q, and the homogeneous coordinates of the two end points of a candidate straight line are p1(x, y, 1) and p2(x, y, 1). Referring to formula (1) below, substituting p1 and p2 yields the coordinates P(X, Y, Z) of the two end points of the straight line in three-dimensional space, from which the slope of the straight line formed by the two end points is calculated.
P = q * K^-1 * p    (1)
Further optionally, in this embodiment, the candidate straight line direction with higher confidence is obtained, and the three-dimensional points corresponding to p1 and p2 are each substituted as the spatial point Pw into formula (2) below to obtain the coordinates of the candidate straight line in the camera model, where RIC and TIC are the rotation and translation matrices of the transformation between the camera and the IMU, and the matrix R is the rotation matrix of the current camera.
Pc = RIC^T * (R^-1 * Pw - TIC)    (2)
Formulas (3) and (4) below are then used to obtain the coordinates (u, v) in the live-action image, where fx and fy are the focal lengths in the x and y directions, respectively, and cx and cy are the offsets of the principal point relative to the origin.
u = fx * X / Z + cx    (3)
v = fy * Y / Z + cy    (4)
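Formulas (2)-(4) can be sketched as follows; the identity rotations, zero camera/IMU offset, and sample intrinsics are assumptions chosen only to keep the example self-contained.

```python
def transpose(M):
    return [list(row) for row in zip(*M)]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
R, RIC = I3, I3                    # current camera rotation, camera/IMU rotation
TIC = [0.0, 0.0, 0.0]              # camera/IMU translation (assumed zero)
fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0   # assumed intrinsics

def project(Pw):
    """Formula (2): Pc = RIC^T * (R^-1 * Pw - TIC); for a rotation matrix,
    R^-1 equals R^T.  Formulas (3)-(4): pinhole projection to pixels."""
    Rw = matvec(transpose(R), Pw)
    Pc = matvec(transpose(RIC), [Rw[i] - TIC[i] for i in range(3)])
    X, Y, Z = Pc
    return fx * X / Z + cx, fy * Y / Z + cy

# A point 4 m ahead and 1 m to the right of the camera:
u, v = project([1.0, 0.0, 4.0])
```

With the assumed intrinsics, the point lands half a focal length (scaled by depth) right of the principal point.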
According to the embodiment provided by the application, first coordinate data of points on the N candidate straight lines on a two-dimensional coordinate system corresponding to the first live-action image are obtained, where the first coordinate data correspond to the camera internal parameters of the target terminal; a conversion matrix between the two-dimensional coordinate system and the three-dimensional coordinate system is calculated, where the conversion matrix corresponds to the camera external parameters of the target terminal; and based on the conversion matrix, the first coordinate data are converted into second coordinate data, where the second coordinate data are the projection coordinate data of the track points on the N first projection straight lines in the three-dimensional coordinate system. The purpose of using the conversion matrix to process coordinate data of different dimensions into coordinate data of the same dimension is thus achieved, and the effect of improving the uniformity of the data to be processed is realized.
As an alternative, calculating a first guideline based on the first projected navigation route includes:
s1, M candidate projection straight lines are obtained, wherein N first projection straight lines comprise the M candidate projection straight lines, and included angles between the candidate projection straight lines and the first projection navigation route meet a second preset relation;
s2, a first guideline is calculated based on the first projected navigation path and the M candidate projected straight lines.
It should be noted that, because many candidate straight lines are not along the traveling direction (for example, the direction indicated by the first projection navigation route), a certain filtering strategy is required to remove those candidate straight lines that may affect the final direction of the live-action navigation route. Specifically, among all candidate straight lines, the candidate straight lines whose included angle with the first projection navigation route satisfies the second preset relation are screened out; and the first guide line is calculated based on the first projection navigation route and the M candidate projection straight lines.
As a further example, optionally, as shown in fig. 6, assume that the sector formed by 30 degrees to the left and 30 degrees to the right of the traveling direction 602 is defined as the confidence area; straight lines whose directions fall outside the confidence area are filtered out, and those whose directions fall inside the confidence area are determined as candidate projection straight lines. For example, since the direction of the first straight line 604 is within the confidence area, the first straight line 604 is determined as a candidate projection straight line, and since the direction of the second straight line 606 is outside the confidence area, the second straight line 606 is filtered out.
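The +/-30 degree confidence sector of fig. 6 amounts to a simple angular test. The names and the sample directions below are assumptions; the modular fold keeps the comparison correct across the +/-180 degree wrap discussed earlier.

```python
def within_confidence(line_deg, travel_deg, half_width=30.0):
    """Fold the direction difference into [-180, 180) and keep the line
    only if it lies within half_width degrees of the traveling direction."""
    diff = (line_deg - travel_deg + 180.0) % 360.0 - 180.0
    return abs(diff) < half_width

travel = 90.0                                  # assumed traveling direction
candidates = [80.0, 95.0, 140.0, -100.0]       # assumed line directions
kept = [a for a in candidates if within_confidence(a, travel)]
```

The 140-degree and -100-degree lines fall outside the sector and are dropped, mirroring the filtering of the second straight line 606.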
According to the embodiment provided by the application, M candidate projection straight lines are obtained, wherein N first projection straight lines comprise the M candidate projection straight lines, and the included angle between the candidate projection straight lines and the first projection navigation route meets a second preset relation; based on the first projection navigation route and the M candidate projection straight lines, the first guide line is calculated, the purpose of filtering the candidate projection straight lines which are not in the advancing direction is achieved, the calculated amount of the guide line is reduced, and the effect of improving the calculation efficiency of the guide line is achieved.
As an alternative, the calculating the first guideline based on the first projected navigation route and M candidate projected straight lines includes:
s1, performing space coordinate transformation on the track points on the M candidate projection straight lines to determine projection coordinates of the track points on M second projection straight lines in a Cartesian coordinate system;
s2, performing space coordinate transformation on the track points on the first projection navigation route to determine the projection coordinates of the track points on the second projection navigation route in a Cartesian coordinate system;
s3, obtaining M target included angles, wherein the target included angles are included angles between the second projection straight line and the second projection navigation route;
s4, calculating a first guide line based on the M target included angles.
It should be noted that spatial coordinate transformation is performed on the track points on the M candidate projection straight lines to determine projection coordinates of the track points on M second projection straight lines in a Cartesian coordinate system; spatial coordinate transformation is performed on the track points on the first projection navigation route to determine projection coordinates of the track points on a second projection navigation route in the Cartesian coordinate system; M target included angles are acquired, where a target included angle is the included angle between a second projection straight line and the second projection navigation route; and the first guide line is calculated based on the M target included angles.
As a further example, optionally, the M candidate projection straight lines and the first projection navigation route are converted into a Cartesian coordinate system to obtain the second projection straight lines and the second projection navigation route, and a least mean square error filtering method is then used to obtain more accurate coordinate data. The specific steps are as follows: first, the average value xave of the included angles between all the second projection straight lines and the second projection navigation route is calculated, and the variance S^2 is then calculated from the average value xave. Referring to formula (5) below, S is the standard deviation, N is the number of second projection straight lines, and xi is the included angle between the i-th second projection straight line and the second projection navigation route. The second projection straight lines whose included angle with the second projection navigation route differs from the average value xave by more than the variance S^2 are then deleted, yielding higher-confidence included angles yi for the retained candidate projection straight lines. Finally, referring to formula (6) below, the average is calculated again using the high-confidence angles yi, where M is the number of high-confidence angles yi and yave is the straight line orientation of the first guide line.
Formula (5): S = sqrt( (1/N) * Σ (xi - xave)^2 )
Formula (6): yave = (1/M) * Σ yi
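Formulas (5) and (6) together amount to the following sketch. One caution: the text above thresholds on the variance S^2, while this sketch thresholds on the standard deviation S (which has the same units as the angles); treat that substitution, like the function name, as our assumption.

```python
import math

def refine_direction(angles):
    """Formula (5): mean xave and standard deviation S of the included
    angles; angles deviating from xave by more than S are dropped;
    formula (6): the survivors are re-averaged to give the guideline
    orientation yave."""
    n = len(angles)
    xave = sum(angles) / n
    s = math.sqrt(sum((x - xave) ** 2 for x in angles) / n)
    kept = [x for x in angles if abs(x - xave) <= s]
    return sum(kept) / len(kept)

# Three consistent angles and one outlier: the outlier is rejected and
# the guideline orientation is the mean of the survivors.
yave = refine_direction([10.0, 11.0, 12.0, 40.0])
```

Here xave is 18.25 and S is about 12.6 degrees, so the 40-degree outlier is removed and yave is 11 degrees.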
According to the embodiment provided by the application, the spatial coordinate transformation is carried out on the track points on the M candidate projection straight lines, so that the projection coordinates of the track points on the M second projection straight lines in a Cartesian coordinate system are determined; performing space coordinate conversion on track points on the first projection navigation route to determine projection coordinates of the track points on the second projection navigation route in a Cartesian coordinate system; acquiring M target included angles, wherein the target included angles are included angles between the second projection straight line and the second projection navigation route; based on the M target included angles, the first guide line is calculated, the purpose of correcting the live-action navigation route by utilizing high-precision guide is achieved, and the effect of improving the display accuracy of the live-action navigation route is achieved.
As an alternative, after the spatial coordinate transformation of the points on the first guideline to determine the display coordinates of the trajectory points on the second navigation route, the method comprises:
s1, acquiring a second live-action image acquired by the target terminal, wherein the acquisition time of the second live-action image is later than that of the first live-action image;
s2, in a case where K candidate straight lines are recognized in the second live view image, calculating a second guideline based on the K candidate straight lines;
s3, performing space coordinate transformation on the points on the second guide line to update the display coordinates of the track points on the second navigation route;
s4, the second navigation route is displayed on the second live view image in accordance with the updated display coordinates.
Optionally, in this embodiment, the target terminal may further, but is not limited to, continue displaying the second navigation route on the first live-action image according to the display coordinates until the second live-action image is acquired, and the target client displays the updated second navigation route on the second live-action image at the moment it updates the first live-action image to the second live-action image.
It should be noted that, when the target client is in the live-action navigation mode, the target terminal acquires an image in real time to allow the target client to display a live-action navigation route on the acquired image, specifically, a second live-action image acquired by the target terminal is acquired, where the acquisition time of the second live-action image is later than the acquisition time of the first live-action image; calculating a second guideline based on the K candidate lines in a case where the K candidate lines are identified in the second live-action image; performing space coordinate conversion on the points on the second guide line to update the display coordinates of the track points on the second navigation route; and displaying the second navigation route on the second live-action image according to the updated display coordinates.
As a further example, optionally, as shown in fig. 7(a), assume that the target client is currently in the live-action navigation mode, a first live-action image 702 acquired at the current time is displayed on the target client, and a second navigation route 704 is displayed on the first live-action image 702. Further, assuming that the live-action image acquired at the time instant following the current time is as shown in the second live-action image 706 of fig. 7(b), the navigation route is updated, and the updated second navigation route 708 is displayed on the second live-action image 706.
According to the embodiment provided by the application, a second live-action image acquired by the target terminal is acquired, where the acquisition time of the second live-action image is later than that of the first live-action image; in a case where K candidate straight lines are identified in the second live-action image, a second guide line is calculated based on the K candidate straight lines; spatial coordinate transformation is performed on the points on the second guide line to update the display coordinates of the track points on the second navigation route; and the second navigation route is displayed on the second live-action image according to the updated display coordinates, so that the purpose of acquiring live-action images in real time for displaying the live-action navigation route is achieved, and a real-time display effect of the live-action navigation route is realized.
As an optional scheme, before the calculating the first guideline based on the N candidate straight lines, the method further includes:
s1, performing image filtering operation on the acquired first live-action image, wherein the image filtering operation is used for removing noise of the first live-action image;
s2, in case that the first live view image after the image filtering operation satisfies the detection condition, performing a straight line detection operation on the first live view image, wherein the straight line detection operation is used to identify a candidate straight line.
Optionally, in this embodiment, the image filtering operation may be, but is not limited to, smoothing the image to remove noise, such as a Gaussian filtering operation, linear filtering, mean filtering, and the like. The straight line detection operation can be used to identify and extract straight lines in the image, for example via deep learning methods such as Ultra-Fast-Lane-Detection, edge extraction using the Canny operator, the Hough transform, and the like.
It should be noted that an image filtering operation is performed on the acquired first live-action image, where the image filtering operation is used to remove noise from the first live-action image; and in a case where the first live-action image after the image filtering operation satisfies the detection condition, a straight line detection operation is performed on the first live-action image, where the straight line detection operation is used to identify candidate straight lines.
As a further example, optionally, an original image is obtained first, Gaussian filtering is then performed on the image, and noise is removed from the smoothed image; edges are then extracted using the Canny operator; and straight line detection is then performed on the image using the Hough transform to obtain raw straight line data with noise. Alternatively, the Canny operator can be, but is not limited to, a multi-stage edge detection algorithm developed by John F. Canny. The Hough transform can be, but is not limited to, a feature extraction technique widely used in image analysis, computer vision, and image processing to detect features.
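A toy version of the Hough voting step, sketched without any dependency on an image library: every edge pixel (x, y) votes for all lines rho = x*cos(theta) + y*sin(theta) that pass through it, and the accumulator bin with the most votes is taken as the dominant straight line. The Gaussian smoothing and Canny edge extraction are assumed to have already produced the set of edge pixels.

```python
import math

def hough_strongest_line(edges, theta_steps=180):
    """Accumulate Hough votes for a set of edge pixels and return the
    (rho, theta) of the most-voted line along with its vote count."""
    acc = {}
    for x, y in edges:
        for t in range(theta_steps):
            theta = math.pi * t / theta_steps
            rho = round(x * math.cos(theta) + y * math.sin(theta))
            acc[(rho, t)] = acc.get((rho, t), 0) + 1
    (rho, t), votes = max(acc.items(), key=lambda kv: kv[1])
    return rho, math.pi * t / theta_steps, votes

# Edge pixels of the vertical line x = 3: all ten vote for rho = 3, theta = 0.
rho, theta, votes = hough_strongest_line({(3, y) for y in range(10)})
```

A production pipeline would instead use an optimized implementation (e.g., the probabilistic Hough transform) over the Canny edge map; this sketch only shows the voting principle.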
According to the embodiment provided by the application, the image filtering operation is executed on the acquired first live-action image, wherein the image filtering operation is used for removing noise points of the first live-action image; under the condition that the first live-action image after the image filtering operation is executed meets the detection condition, the straight line detection operation is executed on the first live-action image, wherein the straight line detection operation is used for identifying candidate straight lines, the purpose of performing preliminary processing on the acquired original image is achieved, and the effect of improving the processing efficiency of the live-action image is achieved.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.
According to another aspect of the embodiments of the present invention, there is also provided a display apparatus of a navigation route for implementing the display method of a navigation route described above. As shown in fig. 8, the apparatus includes:
a first obtaining unit 802, configured to obtain a live-action navigation request triggered in a target client of a target terminal, where the live-action navigation request is used to request a target live-action navigation route to be displayed on a live-action image acquired by the target terminal, and the target live-action navigation route is used to guide the target terminal to move from a current location to a target location;
a response unit 804, configured to respond to the live-action navigation request, obtain a first live-action image currently acquired by the target terminal and a first navigation route generated for the target terminal in the target map, where the first navigation route is a movement prompt track that is identified in the target map and moves from a current position to the target position;
a calculating unit 806, configured to calculate a first guideline based on the N candidate lines when the N candidate lines are identified in the first live-action image, where a second direction indicated by the first guideline and a first direction indicated by the first navigation route satisfy a first preset relationship, and N is a positive integer;
a first determining unit 808, configured to perform spatial coordinate transformation on a point on the first guide line to determine a display coordinate of a track point on a second navigation route, where the second navigation route is a navigation prompt line that guides the target terminal to move from the current position along the first direction and is identified in the first live-action image;
and a display unit 810, configured to display the second navigation route on the first live-action image according to the display coordinates.
Optionally, in this embodiment, the display device of the navigation route may be, but is not limited to, applied to an AR live-action navigation scene. By acquiring straight lines in the image whose direction is the same as or similar to that of the planned route (i.e., the first navigation route), the device corrects the live-action navigation route (i.e., the second navigation route) projected onto the live-action image, thereby improving the display accuracy of the live-action navigation route and avoiding erroneous guidance caused by an offset of the navigation route. It may thus, but is not limited to, solve the technical problem that the navigation route drifts when projected onto the live-action image due to algorithm or hardware accuracy limitations.
Optionally, in this embodiment, the target terminal may be, but is not limited to, a terminal device with an image capturing function, such as a smart phone, a car navigation device, and the like. The target client may be, but is not limited to, an application installed on the target terminal, and the application may be, but is not limited to, providing a guidance route matching the input information to a user of the target terminal, wherein the input information may include, but is not limited to, at least one of: departure point information, current position information (e.g., current position), destination position information (e.g., target position), travel mode information, and travel time information.
Optionally, in this embodiment, the target map may be, but is not limited to, a map displayed in the target client for presenting intuitive address information to the user, and the map may be, but is not limited to, a graphic or image that selectively represents phenomena of the Earth (or another celestial body) on a plane or spherical surface, in two-dimensional or multi-dimensional form, according to certain rules. Before the live-action navigation request triggered in the target client of the target terminal is acquired, the target client may, but is not limited to, generate a first navigation route based on the input information, where the first navigation route is displayed in a target map of the target client, and a display range of the target map corresponds to the input information; for example, the display range is positively correlated with the distance from the departure point to the destination location. Optionally, the first navigation route is displayed in the target map of the target client, and, in a case where a live-action navigation request triggered in the target client is detected, the navigation mode of the target client is switched from the plane map mode to the live-action map mode in response to the live-action navigation request, where, in an application scene of the live-action map mode, the target terminal is instructed to acquire live-action images in real time, and the second navigation route is displayed on the acquired live-action images.
Optionally, in this embodiment, the live-action navigation request may, but is not limited to, carry the input information, and the obtaining of the input information and the triggering of the live-action navigation request may, but are not limited to, occur on different interfaces or on the same interface.
Alternatively, in this embodiment, before the first guide line is calculated based on the N candidate straight lines, the two-dimensional coordinates of the first navigation route in the target map may be, but are not limited to being, converted into projection coordinates in a target coordinate system, and the two-dimensional coordinates of the pixel straight lines in the first live-action image may likewise be, but are not limited to being, converted into projection coordinates in the target coordinate system. Optionally, the target coordinate system may include, but is not limited to, at least one of: a rectangular coordinate system, a three-dimensional coordinate system, a Cartesian coordinate system, a cylindrical coordinate system, a spherical coordinate system, and the like.
Further, for example, the trajectory points on the first navigation route in the target map may be subjected to spatial coordinate transformation to determine the projection coordinates of the trajectory points on the projected navigation straight line in the Cartesian coordinate system, and the pixel (trajectory) points on the N candidate straight lines in the first live-action image may be subjected to spatial coordinate transformation to determine the projection coordinates of the trajectory points on the N projected straight lines in the Cartesian coordinate system. Then, once the coordinate data of the first navigation route and of the N candidate straight lines in the Cartesian coordinate system has been acquired, the target data of the first guide line in the Cartesian coordinate system is calculated based on that coordinate data.
Optionally, in this embodiment, spatial coordinate conversion is performed on the points on the first guide line to determine the display coordinates of those points in the first live-action image, and the display coordinates of the track points on the second navigation route are then determined from the display coordinates of the first guide line in the first live-action image. For example, based on the display coordinates of the first guide line in the first live-action image, the display coordinates of the track points on the second navigation route may be determined in combination with the center-line position of the live-action image, so that the display of the second navigation route is both accurate and visually appealing.
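The conversion from 3-D guide-line points to 2-D display coordinates can be illustrated with a plain pinhole camera model. The following is a minimal sketch, not the patent's implementation: the function name, the distortion-free pinhole projection, and the convention X_cam = R·X + t are all assumptions for illustration.

```python
import numpy as np

def project_to_display(points_world, K, R, t):
    """Project 3-D track points on the (corrected) guide line back into
    pixel 'display coordinates' on the live-action image.

    K is a 3x3 camera intrinsic matrix; R (3x3) and t (3,) map world
    points into the camera frame: x_cam = R @ X + t. A plain pinhole
    model (no lens distortion) is assumed for illustration.
    """
    pts = np.asarray(points_world, float)
    cam = (R @ pts.T).T + t          # world -> camera frame
    uv = (K @ cam.T).T               # apply intrinsics
    return uv[:, :2] / uv[:, 2:3]    # perspective divide -> pixel coords
```

For instance, with principal point (320, 240) and focal length 800, a point 2 m straight ahead of the camera projects to the principal point itself, and a point offset 1 m to the right projects 400 pixels to the right of it.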
In addition, in this embodiment, the first navigation route may first be mapped onto the live-action image by the target client according to the camera pose calculated from consecutive images, and the first navigation route mapped onto the live-action image may then be corrected using the display coordinates of the first guide line in the first live-action image, so as to determine the display coordinates of the track points on the second navigation route.
It should be noted that, in the above apparatus, a live-action navigation request triggered in a target client of a target terminal is obtained, where the live-action navigation request is used to request that a target live-action navigation route be displayed on a live-action image acquired by the target terminal, and the target live-action navigation route is used to guide the target terminal to move from a current position to a target position. In response to the live-action navigation request, a first live-action image currently acquired by the target terminal and a first navigation route generated for the target terminal in a target map are obtained, where the first navigation route is a movement prompt track, identified in the target map, for moving from the current position to the target position. The first live-action image is processed into an image satisfying a detection condition, candidate straight lines are identified in the first live-action image when it satisfies the detection condition, and, when N candidate straight lines are identified, a first guide line is calculated based on the N candidate straight lines, where a second direction indicated by the first guide line and a first direction indicated by the first navigation route satisfy a first preset relationship, and N is a positive integer. Spatial coordinate conversion is performed on points on the first guide line to determine display coordinates of track points on a second navigation route, where the second navigation route is a navigation prompt line, identified in the first live-action image, that guides the target terminal to move from the current position along the first direction. Finally, the second navigation route is displayed on the first live-action image according to the display coordinates.
Optionally, in this embodiment, cyclic (angular) data raises the problem of value jumps. For example, if east is defined as 0 degrees, north as 90 degrees, and south as -90 degrees, then west may be represented as either 180 degrees or -180 degrees; when the traveling direction is near west, errors in the algorithm and the sensors can produce a string of values alternating near 180 and -180 degrees, and complicated judgment logic would have to be introduced to avoid such jumps. Likewise, if one full counterclockwise turn starting from east is defined as 0-360 degrees, the jump between 0 and 360 degrees must be considered. Handling this directly is not impossible, but it is overly complex, especially when the straight-line filtering requires computing the mean, variance, and similar statistics of angles. Instead, let the second coordinate system be an angle coordinate system and the third coordinate system be a Cartesian coordinate system, and let an included angle be x: each angle is converted by trigonometric functions into the vector (cos(x), sin(x)), the mean of each component over all angles is computed in the third coordinate system, and the mean angle is then recovered. This greatly reduces the computation required for the straight-line filtering and improves its efficiency.
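The (cos(x), sin(x)) averaging described above can be sketched in a few lines. This is a minimal illustration; the function name and the use of degrees are choices made here for readability, not taken from the patent.

```python
import math

def mean_angle(angles_deg):
    """Average a list of angles (in degrees) without wrap-around jumps.

    Each angle x is mapped to the unit vector (cos x, sin x) in a
    Cartesian coordinate system, the components are averaged, and the
    mean angle is recovered with atan2, so values straddling the
    180/-180 (or 0/360) seam average correctly.
    """
    xs = [math.cos(math.radians(a)) for a in angles_deg]
    ys = [math.sin(math.radians(a)) for a in angles_deg]
    return math.degrees(math.atan2(sum(ys) / len(ys), sum(xs) / len(xs)))
```

Averaging 179 and -179 degrees this way yields 180 degrees, whereas a naive arithmetic mean would give the meaningless answer 0; this is exactly the jump problem the Cartesian detour avoids.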
For a specific embodiment, reference may be made to the example shown in the display method of the navigation route, which is not described herein again in this example.
According to the embodiment provided by the application, a live-action navigation request triggered in a target client of a target terminal is obtained, where the live-action navigation request is used to request that a target live-action navigation route be displayed on a live-action image acquired by the target terminal, and the target live-action navigation route is used to guide the target terminal to move from the current position to the target position. In response to the live-action navigation request, a first live-action image currently acquired by the target terminal and a first navigation route generated for the target terminal in a target map are obtained, where the first navigation route is a movement prompt track, identified in the target map, for moving from the current position to the target position. When N candidate straight lines are identified in the first live-action image, a first guide line is calculated based on the N candidate straight lines, where a second direction indicated by the first guide line and a first direction indicated by the first navigation route satisfy a first preset relationship, and N is a positive integer. Spatial coordinate conversion is performed on points on the first guide line to determine display coordinates of track points on a second navigation route, where the second navigation route is a navigation prompt line, identified in the first live-action image, that guides the target terminal to move from the current position along the first direction. The second navigation route is then displayed on the first live-action image according to the display coordinates. By acquiring a first guide line that satisfies the first preset relationship with the first direction indicated by the first navigation route, and using it as the guide line of the live-action navigation route finally displayed on the first live-action image, the live-action navigation route is corrected, thereby achieving the technical purpose of correcting the live-action navigation route and the technical effect of improving the display accuracy of the live-action navigation guide line.
As an alternative, the computing unit 806 includes:
the first determination module is used for performing space coordinate conversion on points on the N candidate straight lines so as to determine projection coordinates of track points on the N first projection straight lines in the three-dimensional coordinate system;
the second determination module is used for carrying out space coordinate conversion on the points on the first navigation route so as to determine the projection coordinates of the track points on the first projection navigation route in the three-dimensional coordinate system;
and the calculating module is used for calculating a first guide line based on the first projection navigation route.
For a specific embodiment, reference may be made to the example shown in the display method of the navigation route, which is not described herein again in this example.
As an optional solution, the first determining module includes:
the first acquisition submodule is used for acquiring first coordinate data of points on the N candidate straight lines on a two-dimensional coordinate system corresponding to the first live-action image, wherein the first coordinate data correspond to camera internal parameters of the target terminal;
the calculation submodule is used for calculating a conversion matrix of a two-dimensional coordinate system and a three-dimensional coordinate system, wherein the conversion matrix corresponds to the camera external parameters of the target terminal;
and the conversion submodule is used for converting the first coordinate data into second coordinate data based on the conversion matrix, wherein the second coordinate data is projection coordinate data of track points on the N first projection straight lines in a three-dimensional coordinate system.
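The role of the camera intrinsics ("camera internal parameters") and the extrinsic conversion matrix in this 2-D-to-3-D conversion can be sketched as a back-projection. The names and the convention X_cam = R·X_world + t below are assumptions for illustration; note that a single pixel determines only a ray, along which the 3-D points of a candidate straight line must lie.

```python
import numpy as np

def pixel_to_world_ray(u, v, K, R, t):
    """Back-project a pixel (u, v) into a 3-D ray in world coordinates.

    K is the 3x3 intrinsic matrix (camera internal parameters); R (3x3)
    and t (3,) are the extrinsics (the 'conversion matrix'), mapping
    world points into the camera frame: X_cam = R @ X_world + t.
    Returns the camera centre and a unit direction vector.
    """
    d_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # ray in camera frame
    d_world = R.T @ d_cam                             # rotate into world frame
    centre = -R.T @ t                                 # camera centre in world frame
    return centre, d_world / np.linalg.norm(d_world)
```

With identity extrinsics, the principal point back-projects to the camera's optical axis, as expected.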
For a specific embodiment, reference may be made to the example shown in the display method of the navigation route, which is not described herein again in this example.
As an alternative, the calculation module includes:
the second obtaining submodule is used for obtaining M candidate projection straight lines, wherein the N first projection straight lines comprise the M candidate projection straight lines, and an included angle between the candidate projection straight lines and the first projection navigation route meets a second preset relation;
and the second calculation submodule is used for calculating a first guide line based on the first projection navigation route and the M candidate projection straight lines.
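One plausible reading of the "second preset relation" is a simple threshold on the included angle between each candidate projected line and the projected navigation route. The sketch below assumes 2-D direction vectors and an illustrative 15-degree threshold; neither the threshold value nor the names are fixed by the patent.

```python
import numpy as np

def filter_candidates(nav_dir, line_dirs, max_angle_deg=15.0):
    """Keep candidate projected straight lines whose included angle with
    the projected navigation route is at most max_angle_deg.

    nav_dir: 2-D direction vector of the first projected navigation route.
    line_dirs: iterable of 2-D direction vectors, one per candidate line.
    """
    nav = np.asarray(nav_dir, float)
    nav = nav / np.linalg.norm(nav)
    kept = []
    for d in line_dirs:
        d = np.asarray(d, float)
        d = d / np.linalg.norm(d)
        # |dot| treats a line and its reverse as the same direction
        angle = np.degrees(np.arccos(np.clip(abs(nav @ d), 0.0, 1.0)))
        if angle <= max_angle_deg:
            kept.append(d)
    return kept
```

A near-horizontal navigation direction would thus keep near-horizontal candidate lines (including ones pointing the opposite way) and discard perpendicular ones such as lamp posts or door frames.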
For a specific embodiment, reference may be made to the example shown in the display method of the navigation route, which is not described herein again in this example.
As an alternative, the second computation submodule includes:
the first determining subunit is used for performing space coordinate conversion on the track points on the M candidate projection straight lines so as to determine the projection coordinates of the track points on the M second projection straight lines in a Cartesian coordinate system;
the second determining subunit is used for performing space coordinate conversion on the track points on the first projection navigation route so as to determine the projection coordinates of the track points on the second projection navigation route in the Cartesian coordinate system;
the acquisition subunit is used for acquiring M target included angles, wherein the target included angles are included angles between the second projection straight line and the second projection navigation route;
and the calculating subunit is used for calculating the first guide line based on the M target included angles.
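One way the first guide line's direction could be derived from the M target included angles is to average the included angles with the wrap-around-safe Cartesian-component method and apply the mean as an offset to the navigation route's own direction. This combination rule, like all names here, is an assumption sketched for illustration.

```python
import math

def guide_direction(nav_angle_deg, included_angles_deg):
    """Derive the first guide line's direction (in degrees) from the M
    target included angles: average the included angles in vector form
    (cos, sin) to avoid jump values, then offset the navigation route's
    direction by the mean included angle."""
    xs = sum(math.cos(math.radians(a)) for a in included_angles_deg)
    ys = sum(math.sin(math.radians(a)) for a in included_angles_deg)
    mean_offset = math.degrees(math.atan2(ys, xs))
    return nav_angle_deg + mean_offset
```

Symmetric disagreements cancel: included angles of +5 and -5 degrees leave the navigation direction unchanged.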
For a specific embodiment, reference may be made to the example shown in the display method of the navigation route, which is not described herein again in this example.
As an alternative, as shown in fig. 9, the apparatus further includes:
a second obtaining unit 902, configured to obtain a second live-action image collected by the target terminal after performing spatial coordinate transformation on a point on the first guide line to determine a display coordinate of a track point on the second navigation route, where a collection time of the second live-action image is later than a collection time of the first live-action image;
a second determining unit 904 configured to calculate a second guideline based on K candidate straight lines in a case where the K candidate straight lines are identified in the second live-action image after performing spatial coordinate conversion on the point on the first guideline to determine the display coordinates of the track point on the second navigation route;
a third determining unit 906, configured to perform spatial coordinate conversion on the points on the second guide line to update the display coordinates of the track points on the second navigation route, after performing spatial coordinate conversion on the points on the first guide line to determine the display coordinates of the track points on the second navigation route;
a fourth determining unit 908, configured to display the second navigation route on the second live-action image according to the updated display coordinates after performing spatial coordinate conversion on the point on the first guide line to determine the display coordinates of the track point on the second navigation route.
For a specific embodiment, reference may be made to the example shown in the display method of the navigation route, which is not described herein again in this example.
As an optional scheme, the method further comprises the following steps:
the first execution unit is used for executing image filtering operation on the acquired first live-action image before calculating a first guide line based on the N candidate straight lines, wherein the image filtering operation is used for removing noise points of the first live-action image;
and a second execution unit, configured to, before the first guide line is calculated based on the N candidate straight lines, perform a straight-line detection operation on the first live-action image after the image filtering operation in a case where the first live-action image satisfies a detection condition, where the straight-line detection operation is used to identify the candidate straight lines.
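A minimal sketch of the two operations, assuming a mean filter as the "image filtering operation" and a simple contrast check as the "detection condition". The patent fixes neither choice (a Gaussian or median filter, and other detection conditions, would serve equally well), so both functions and their thresholds are illustrative.

```python
import numpy as np

def mean_filter(img, k=3):
    """k x k mean filter: one possible image filtering operation for
    suppressing noise points before straight-line detection."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            # accumulate each shifted window, then divide by window size
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def meets_detection_condition(img, min_contrast=10.0):
    """Illustrative detection condition: the frame must have enough
    contrast for straight-line detection to be meaningful; the concrete
    condition and threshold are assumptions, not taken from the patent."""
    return float(img.max()) - float(img.min()) >= min_contrast
```

In a real pipeline, a frame passing the condition would then be handed to a line detector (e.g., a Hough-transform or LSD-style detector) to produce the N candidate straight lines.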
For a specific embodiment, reference may be made to the example shown in the display method of the navigation route, which is not described herein again in this example.
According to still another aspect of an embodiment of the present invention, there is further provided an electronic device for implementing the display method of the navigation route, as shown in fig. 10, the electronic device includes a memory 1002 and a processor 1004, the memory 1002 stores a computer program, and the processor 1004 is configured to execute the steps in any one of the method embodiments through the computer program.
Optionally, in this embodiment, the electronic device may be located in at least one network device of a plurality of network devices of a computer network.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
S1, acquiring a live-action navigation request triggered in a target client of a target terminal, where the live-action navigation request is used to request that a target live-action navigation route be displayed on a live-action image acquired by the target terminal, and the target live-action navigation route is used to guide the target terminal to move from the current position to the target position;
S2, in response to the live-action navigation request, acquiring a first live-action image currently acquired by the target terminal and a first navigation route generated for the target terminal in the target map, where the first navigation route is a movement prompt track, identified in the target map, for moving from the current position to the target position;
S3, when N candidate straight lines are identified in the first live-action image, calculating a first guide line based on the N candidate straight lines, where a second direction indicated by the first guide line and a first direction indicated by the first navigation route satisfy a first preset relationship, and N is a positive integer;
S4, performing spatial coordinate conversion on the points on the first guide line to determine display coordinates of track points on a second navigation route, where the second navigation route is a navigation prompt line, identified in the first live-action image, that guides the target terminal to move from the current position along the first direction;
S5, displaying the second navigation route on the first live-action image according to the display coordinates.
Alternatively, it can be understood by those skilled in the art that the structure shown in fig. 10 is only an illustration, and the electronic device may also be a terminal device such as a smart phone (e.g., an Android phone, an iOS phone, etc.), a tablet computer, a palmtop computer, a mobile Internet device (MID), a PAD, and the like. Fig. 10 does not limit the structure of the electronic device. For example, the electronic device may also include more or fewer components (e.g., network interfaces, etc.) than shown in fig. 10, or have a configuration different from that shown in fig. 10.
The memory 1002 may be used to store software programs and modules, such as the program instructions/modules corresponding to the method and apparatus for displaying a navigation route in the embodiment of the present invention, and the processor 1004 executes various functional applications and data processing by running the software programs and modules stored in the memory 1002, that is, implements the method for displaying a navigation route. The memory 1002 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 1002 may further include memory located remotely from the processor 1004, which may be connected to the terminal over a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof. The memory 1002 may specifically be, but is not limited to being, used to store information such as the first live-action image, the first navigation route, and the second navigation route. As an example, as shown in fig. 10, the memory 1002 may include, but is not limited to, the first obtaining unit 802, the response unit 804, the calculating unit 806, the first determining unit 808, and the display unit 810 of the display device of the navigation route. In addition, the memory 1002 may further include, but is not limited to, other module units of the display device of the navigation route, which are not described in detail in this example.
Optionally, the above-mentioned transmission device 1006 is used for receiving or sending data via a network. Examples of the network may include wired and wireless networks. In one example, the transmission device 1006 includes a network adapter (Network Interface Controller, NIC) that can be connected to a router and other network devices via a network cable so as to communicate with the Internet or a local area network. In another example, the transmission device 1006 is a radio frequency (RF) module, which is used to communicate with the Internet in a wireless manner.
In addition, the electronic device further includes: a display 1008 for displaying information such as the first live-action image, the first navigation route, and the second navigation route; and a connection bus 1010 for connecting the respective module parts of the above-described electronic device.
In other embodiments, the terminal device or the server may be a node in a distributed system, where the distributed system may be a blockchain system, and the blockchain system may be a distributed system formed by connecting a plurality of nodes through a network communication. The nodes may form a Peer-To-Peer (P2P) network, and any type of computing device, such as a server, a terminal, and other electronic devices, may become a node in the blockchain system by joining the Peer-To-Peer network.
According to an aspect of the application, a computer program product or computer program is provided, comprising computer instructions, the computer instructions being stored in a computer readable storage medium. A processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to execute the method for displaying a navigation route, wherein the computer program is configured to execute the steps in any of the method embodiments described above.
Alternatively, in the present embodiment, the above-mentioned computer-readable storage medium may be configured to store a computer program for executing the steps of:
S1, acquiring a live-action navigation request triggered in a target client of a target terminal, where the live-action navigation request is used to request that a target live-action navigation route be displayed on a live-action image acquired by the target terminal, and the target live-action navigation route is used to guide the target terminal to move from the current position to the target position;
S2, in response to the live-action navigation request, acquiring a first live-action image currently acquired by the target terminal and a first navigation route generated for the target terminal in the target map, where the first navigation route is a movement prompt track, identified in the target map, for moving from the current position to the target position;
S3, when N candidate straight lines are identified in the first live-action image, calculating a first guide line based on the N candidate straight lines, where a second direction indicated by the first guide line and a first direction indicated by the first navigation route satisfy a first preset relationship, and N is a positive integer;
S4, performing spatial coordinate conversion on the points on the first guide line to determine display coordinates of track points on a second navigation route, where the second navigation route is a navigation prompt line, identified in the first live-action image, that guides the target terminal to move from the current position along the first direction;
S5, displaying the second navigation route on the first live-action image according to the display coordinates.
Alternatively, in this embodiment, a person skilled in the art may understand that all or part of the steps in the methods of the foregoing embodiments may be implemented by a program instructing hardware associated with the terminal device, where the program may be stored in a computer-readable storage medium, and the storage medium may include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
The integrated unit in the above embodiments, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in the above computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, etc.) to execute all or part of the steps of the method according to the embodiments of the present invention.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and refinements can be made without departing from the principle of the present invention, and these modifications and refinements should also be regarded as falling within the protection scope of the present invention.

Claims (10)

1. A display method of a navigation route, comprising:
acquiring a live-action navigation request triggered in a target client of a target terminal, wherein the live-action navigation request is used for requesting to display a target live-action navigation route on a live-action image acquired by the target terminal, and the target live-action navigation route is used for guiding the target terminal to move from a current position to a target position;
responding to the live-action navigation request, acquiring a first live-action image currently acquired by the target terminal and a first navigation route generated for the target terminal in a target map, wherein the first navigation route is a movement prompting track which is identified in the target map and moves from the current position to the target position;
under the condition that N candidate straight lines are identified in the first live-action image, calculating a first guide line based on the N candidate straight lines, wherein a second direction indicated by the first guide line and a first direction indicated by the first navigation route meet a first preset relation, and N is a positive integer;
performing space coordinate conversion on points on the first guide line to determine display coordinates of track points on a second navigation route, wherein the second navigation route is a navigation prompt line which is identified in the first live-action image and guides the target terminal to move from the current position along the first direction;
and displaying the second navigation route on the first live-action image according to the display coordinates.
2. The method of claim 1, wherein said calculating a first guideline based on the N candidate lines comprises:
performing space coordinate conversion on the points on the N candidate straight lines to determine projection coordinates of track points on N first projection straight lines in a three-dimensional coordinate system;
performing space coordinate conversion on points on the first navigation route to determine projection coordinates of track points on a first projection navigation route in the three-dimensional coordinate system;
calculating the first guide line based on the first projection navigation route.
3. The method according to claim 2, wherein said performing space coordinate conversion on the points on the N candidate straight lines to determine projection coordinates of track points on the N first projection straight lines in the three-dimensional coordinate system comprises:
acquiring first coordinate data of points on the N candidate straight lines on a two-dimensional coordinate system corresponding to the first live-action image, wherein the first coordinate data corresponds to camera internal parameters of the target terminal;
calculating a conversion matrix of the two-dimensional coordinate system and the three-dimensional coordinate system, wherein the conversion matrix corresponds to the camera external parameters of the target terminal;
and converting the first coordinate data into second coordinate data based on the conversion matrix, wherein the second coordinate data is projection coordinate data of track points on the N first projection straight lines in the three-dimensional coordinate system.
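Claim 3's two-step conversion — back-projecting image points through the camera internal parameters, then applying a conversion matrix built from the camera external parameters — can be sketched as follows. The focal lengths, principal point, rotation, and translation below are illustrative placeholders, not values from the patent:

```python
def pixel_to_camera(u, v, fx, fy, cx, cy, depth=1.0):
    # invert the pinhole intrinsics (camera internal parameters):
    # pixel (u, v) -> point in the camera frame at the given depth
    return ((u - cx) * depth / fx, (v - cy) * depth / fy, depth)

def camera_to_world(p, R, t):
    # apply the extrinsic conversion (camera external parameters):
    # rotation matrix R (3x3, row-major) and translation vector t
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3))

# illustrative parameters: 640x480 image, identity extrinsics
I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
point = camera_to_world(
    pixel_to_camera(320.0, 240.0, 500.0, 500.0, 320.0, 240.0),
    I3, (0.0, 0.0, 0.0))
```

Each point on a candidate straight line would be converted this way to obtain its projection coordinates in the three-dimensional coordinate system.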
4. The method of claim 2, wherein said calculating the first guide line based on the first projection navigation route comprises:
obtaining M candidate projection straight lines, wherein the N first projection straight lines comprise the M candidate projection straight lines, and an included angle between the candidate projection straight lines and the first projection navigation route meets a second preset relation;
and calculating the first guide line based on the first projection navigation route and the M candidate projection straight lines.
5. The method of claim 4, wherein said calculating the first guide line based on the first projection navigation route and the M candidate projection straight lines comprises:
performing space coordinate conversion on the track points on the M candidate projection straight lines to determine projection coordinates of the track points on M second projection straight lines in a Cartesian coordinate system;
performing space coordinate conversion on track points on the first projection navigation route to determine projection coordinates of the track points on a second projection navigation route in the Cartesian coordinate system;
acquiring M target included angles, wherein each target included angle is an included angle between one of the M second projection straight lines and the second projection navigation route;
and calculating the first guide line based on the M target included angles.
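Claims 4 and 5 filter candidates by their included angle with the projected navigation route and then derive the guide line from the M surviving candidates. One plausible realisation (the 15° threshold and the sign-aligned averaging are assumptions, not claim language) is:

```python
import math

def fuse_guide_direction(candidates, route_dir, max_angle_deg=15.0):
    # keep the M candidates whose included angle with the projected route
    # satisfies the (hypothetical) threshold, then average their
    # sign-aligned unit vectors into a single fused guide direction
    rx, ry = route_dir
    rn = math.hypot(rx, ry)
    kept = []
    for vx, vy in candidates:
        cn = math.hypot(vx, vy)
        dot = (vx * rx + vy * ry) / (cn * rn)
        ang = math.degrees(math.acos(max(-1.0, min(1.0, abs(dot)))))
        if ang <= max_angle_deg:
            s = 1.0 if dot >= 0 else -1.0  # align sign with the route
            kept.append((s * vx / cn, s * vy / cn))
    if not kept:
        return None
    mx = sum(p[0] for p in kept) / len(kept)
    my = sum(p[1] for p in kept) / len(kept)
    return (mx, my)
```

Averaging several near-parallel detected lines (lane edges, curbs) damps per-frame detection noise compared with trusting any single candidate.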
6. The method according to any one of claims 1 to 5, further comprising, after said performing space coordinate conversion on the points on the first guide line to determine the display coordinates of the track points on the second navigation route:
acquiring a second live-action image acquired by the target terminal, wherein the acquisition time of the second live-action image is later than that of the first live-action image;
in a case where K candidate straight lines are identified in the second live-action image, calculating a second guide line based on the K candidate straight lines;
performing space coordinate conversion on the points on the second guide line to update the display coordinates of the track points on the second navigation route;
and displaying the second navigation route on the second live-action image according to the updated display coordinates.
7. The method according to any one of claims 1 to 5, further comprising, before said calculating a first guide line based on the N candidate straight lines:
performing an image filtering operation on the acquired first live-action image, wherein the image filtering operation is used for removing noise of the first live-action image;
and in the case that the first live-action image after the image filtering operation meets the detection condition, performing a straight line detection operation on the first live-action image, wherein the straight line detection operation is used for identifying the candidate straight lines.
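Claim 7's pre-processing — denoise the frame, then gate line detection on a detection condition — can be sketched in pure Python. The 3x3 mean filter and the contrast-based gate below are illustrative choices; the patent does not fix a particular filter or condition:

```python
def box_filter(img):
    # 3x3 mean filter to suppress pixel noise before line detection
    # (img is a list of rows of grayscale values)
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - 1), min(h, y + 2))
                    for xx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out

def passes_detection_condition(img, min_contrast=10.0):
    # hypothetical gate: only run straight line detection when the
    # filtered frame retains enough contrast to contain usable edges
    flat = [v for row in img for v in row]
    return (max(flat) - min(flat)) >= min_contrast
```

In a production pipeline the same two stages would typically be Gaussian blur plus a Hough-style line detector, but this sketch keeps the claimed filter-then-gate ordering visible.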
8. A display device for a navigation route, comprising:
the system comprises a first acquisition unit, a second acquisition unit and a third acquisition unit, wherein the first acquisition unit is used for acquiring a live-action navigation request triggered in a target client of a target terminal, the live-action navigation request is used for requesting a target live-action navigation route to be displayed on a live-action image acquired by the target terminal, and the target live-action navigation route is used for guiding the target terminal to move from a current position to a target position;
a response unit, configured to, in response to the live-action navigation request, obtain a first live-action image currently acquired by the target terminal and a first navigation route generated for the target terminal in a target map, where the first navigation route is a movement prompt track that is identified in the target map and moves from the current position to the target position;
a calculating unit, configured to calculate a first guideline based on N candidate straight lines when the N candidate straight lines are identified in the first live-action image, where a second direction indicated by the first guideline and a first direction indicated by the first navigation route satisfy a first preset relationship, and N is a positive integer;
a first determining unit, configured to perform spatial coordinate transformation on a point on the first guide line to determine a display coordinate of a track point on a second navigation route, where the second navigation route is a navigation prompt line that is identified in the first live-action image and that directs the target terminal to move from the current position along the first direction;
and the display unit is used for displaying the second navigation route on the first live-action image according to the display coordinates.
9. A computer-readable storage medium, comprising a stored program, wherein the program, when executed, performs the method of any one of claims 1 to 7.
10. An electronic device comprising a memory and a processor, wherein the memory has a computer program stored therein, and the processor is arranged to execute the method of any one of claims 1 to 7 by means of the computer program.
CN202011419148.0A 2020-12-07 2020-12-07 Navigation route display method and device, storage medium and electronic equipment Active CN112556685B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011419148.0A CN112556685B (en) 2020-12-07 2020-12-07 Navigation route display method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011419148.0A CN112556685B (en) 2020-12-07 2020-12-07 Navigation route display method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN112556685A CN112556685A (en) 2021-03-26
CN112556685B true CN112556685B (en) 2022-03-25

Family

ID=75059572

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011419148.0A Active CN112556685B (en) 2020-12-07 2020-12-07 Navigation route display method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112556685B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113483774B (en) * 2021-06-29 2023-11-03 阿波罗智联(北京)科技有限公司 Navigation method, navigation device, electronic equipment and readable storage medium
CN113961065B (en) * 2021-09-18 2022-10-11 北京城市网邻信息技术有限公司 Navigation page display method and device, electronic equipment and storage medium
CN115493614B (en) * 2022-11-21 2023-03-24 泽景(西安)汽车电子有限责任公司 Method and device for displaying flight path line, storage medium and electronic equipment
CN117128959A (en) * 2023-04-18 2023-11-28 荣耀终端有限公司 Car searching navigation method, electronic equipment, server and system
CN116793382B (en) * 2023-06-25 2024-02-02 江苏泽景汽车电子股份有限公司 Lane navigation information display method and device, electronic equipment and storage medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008309529A (en) * 2007-06-12 2008-12-25 Panasonic Corp Navigation system, navigation method and program for navigation
CN104180814A (en) * 2013-05-22 2014-12-03 北京百度网讯科技有限公司 Navigation method in live-action function on mobile terminal, and electronic map client
CN104159036B (en) * 2014-08-26 2018-09-18 惠州Tcl移动通信有限公司 A kind of display methods and capture apparatus of image orientation information
CN106092121B (en) * 2016-05-27 2017-11-24 百度在线网络技术(北京)有限公司 Automobile navigation method and device
TWI657409B (en) * 2017-12-27 2019-04-21 財團法人工業技術研究院 Superimposition device of virtual guiding indication and reality image and the superimposition method thereof
CN111044061B (en) * 2018-10-12 2023-03-28 腾讯大地通途(北京)科技有限公司 Navigation method, device, equipment and computer readable storage medium
CN109931945B (en) * 2019-04-02 2021-07-06 百度在线网络技术(北京)有限公司 AR navigation method, device, equipment and storage medium
CN111623795B (en) * 2020-05-28 2022-04-15 阿波罗智联(北京)科技有限公司 Live-action navigation icon display method, device, equipment and medium

Also Published As

Publication number Publication date
CN112556685A (en) 2021-03-26

Similar Documents

Publication Publication Date Title
CN112556685B (en) Navigation route display method and device, storage medium and electronic equipment
CN112894832B (en) Three-dimensional modeling method, three-dimensional modeling device, electronic equipment and storage medium
CN108986161B (en) Three-dimensional space coordinate estimation method, device, terminal and storage medium
JP6768156B2 (en) Virtually enhanced visual simultaneous positioning and mapping systems and methods
US9270891B2 (en) Estimation of panoramic camera orientation relative to a vehicle coordinate frame
CN111337947A (en) Instant mapping and positioning method, device, system and storage medium
CN110176032B (en) Three-dimensional reconstruction method and device
KR100855657B1 (en) System for estimating self-position of the mobile robot using monocular zoom-camara and method therefor
CN111121754A (en) Mobile robot positioning navigation method and device, mobile robot and storage medium
KR102200299B1 (en) A system implementing management solution of road facility based on 3D-VR multi-sensor system and a method thereof
CN111709973B (en) Target tracking method, device, equipment and storage medium
JP4132068B2 (en) Image processing apparatus, three-dimensional measuring apparatus, and program for image processing apparatus
CN112365549B (en) Attitude correction method and device for vehicle-mounted camera, storage medium and electronic device
CN110926478B (en) AR navigation route deviation rectifying method and system and computer readable storage medium
CN110660098A (en) Positioning method and device based on monocular vision
CN110986969A (en) Map fusion method and device, equipment and storage medium
CN115272494B (en) Calibration method and device for camera and inertial measurement unit and computer equipment
CN112700486A (en) Method and device for estimating depth of road lane line in image
CN113587934A (en) Robot, indoor positioning method and device and readable storage medium
CN110749308A (en) SLAM-oriented outdoor positioning method using consumer-grade GPS and 2.5D building models
JP2023503750A (en) ROBOT POSITIONING METHOD AND DEVICE, DEVICE, STORAGE MEDIUM
He et al. Three-point-based solution for automated motion parameter estimation of a multi-camera indoor mapping system with planar motion constraint
US11557059B2 (en) System and method for determining position of multi-dimensional object from satellite images
CN112750164B (en) Lightweight positioning model construction method, positioning method and electronic equipment
KR20210133583A (en) Apparatus for building map using gps information and lidar signal and controlling method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40040680

Country of ref document: HK

GR01 Patent grant