CN114413919A - Navigation method, device, equipment and computer storage medium - Google Patents

Navigation method, device, equipment and computer storage medium

Info

Publication number
CN114413919A
CN114413919A (application CN202111681859.XA)
Authority
CN
China
Prior art keywords
information
navigation
target
point
augmented reality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111681859.XA
Other languages
Chinese (zh)
Inventor
刘万凯
张维智
班如庆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN202111681859.XA priority Critical patent/CN114413919A/en
Publication of CN114413919A publication Critical patent/CN114413919A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26: Navigation specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/3453: Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/28: Navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30: Map- or contour-matching
    • G01C21/32: Structuring or formatting of map data

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An embodiment of the application discloses a navigation method, apparatus, device, and computer storage medium. The method is applied to an augmented reality device and includes the following steps: obtaining connectivity information and distance information among a plurality of landmark nodes in a point cloud map, where the point cloud map is constructed according to a target environment; determining a starting point and an end point from the plurality of landmark nodes; generating a target navigation path according to the starting point, the end point, the connectivity information, and the distance information; and displaying a navigation identifier on the augmented reality device based on the target navigation path. Because path navigation between the starting point and the end point is generated from the connectivity information and distance information between landmark nodes, the path is planned dynamically and the navigation path is displayed on the augmented reality device, which both reduces computational complexity and improves navigation efficiency.

Description

Navigation method, device, equipment and computer storage medium
Technical Field
The present application relates to the field of positioning and navigation technologies, and in particular, to a navigation method, apparatus, device, and computer storage medium.
Background
With the development of society and the construction of cities, intelligent positioning and navigation in enclosed indoor environments has become a broad social need. A Simultaneous Localization and Mapping (SLAM) system works as follows: when a device is in an unknown environment, it acquires its motion state and surrounding-environment information through sensors, reconstructs a three-dimensional model of the surroundings in real time, and localizes the device at the same time.
However, in the related art, the mainstream scheme for three-dimensional space navigation is based on mesh search after three-dimensional reconstruction: a three-dimensional mesh is reconstructed in advance over the traversable paths, and path navigation is then performed on the mesh. Because the spatial mesh patches must be reconstructed in advance, the navigation map has a huge data volume, so the consumption of device computing resources is high and the navigation effect is not ideal.
Disclosure of Invention
The application provides a navigation method, apparatus, device, and computer storage medium, which mark landmark nodes in a map, obtain a navigation path through dynamic path search, and display the navigation path on an augmented reality device, thereby not only reducing computational complexity but also improving navigation efficiency.
In order to achieve the purpose, the technical scheme of the application is realized as follows:
in a first aspect, an embodiment of the present application provides a navigation method applied to an augmented reality device, where the method includes:
the method comprises the steps of obtaining connectivity information and distance information among a plurality of road sign nodes in a point cloud map, wherein the point cloud map is constructed according to a target environment;
determining a starting point and an end point from the plurality of landmark nodes;
generating a target navigation path according to the starting point, the end point, the connectivity information and the distance information;
and displaying a navigation identifier on the augmented reality equipment based on the target navigation path.
In some embodiments, the determining a starting point and an ending point from the plurality of landmark nodes comprises:
acquiring position information of the augmented reality device on the point cloud map, and determining the landmark node corresponding to the position information as the starting point;
and determining, based on user input information, the landmark node corresponding to the user input information as the end point.
In some embodiments, the obtaining the location information of the augmented reality device on the target point cloud map comprises:
acquiring current visual information of the augmented reality device;
and calculating the pose of the augmented reality equipment according to the visual information and the point cloud map to obtain the position information of the augmented reality equipment.
In some embodiments, the generating the target navigation path based on the start point, the end point, the connectivity information, and the distance information comprises: determining a target landmark node between the starting point and the ending point based on the starting point and the ending point;
acquiring a corresponding target adjacency matrix from the adjacency matrix; wherein the target adjacency matrix comprises a starting point, an end point and a target landmark node;
determining a shortest path from the starting point to the end point through a shortest path algorithm by using the distance between two points in the target adjacency matrix as an edge weight based on the target adjacency matrix;
determining the shortest path as the target navigation path.
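The disclosure names only "a shortest path algorithm"; as an illustrative, non-authoritative sketch, this step can be realized with Dijkstra's algorithm over an adjacency matrix in which `None` marks unconnected node pairs. All function and variable names below are hypothetical, not from the disclosure:

```python
import heapq

def shortest_path(adj, start, end):
    """Dijkstra's algorithm with the distance between two points as edge weight.

    adj[i][j]: distance between landmark nodes i and j, or None if unconnected.
    Returns (total_distance, node_index_path); (inf, []) if end is unreachable.
    """
    n = len(adj)
    dist = [float("inf")] * n
    prev = [None] * n
    dist[start] = 0.0
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue  # stale heap entry
        if u == end:
            break
        for v in range(n):
            w = adj[u][v]
            if v == u or w is None:
                continue  # skip self-loops and unconnected pairs
            if d + w < dist[v]:
                dist[v] = d + w
                prev[v] = u
                heapq.heappush(heap, (d + w, v))
    if dist[end] == float("inf"):
        return float("inf"), []
    path = [end]
    while prev[path[-1]] is not None:
        path.append(prev[path[-1]])
    path.reverse()
    return dist[end], path
```

For example, with four nodes where pairs 0-1, 1-2, and 2-3 are connected with distances 2, 3, and 1, and pair 0-2 with distance 10, the search returns the path 0-1-2-3 of total length 6 rather than the direct 0-2-3 route of length 11.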
In some embodiments, the displaying, on the augmented reality device, a navigation identifier based on the target navigation path includes:
determining direction information corresponding to the navigation identification according to the target navigation path;
acquiring point cloud information of the ground by using a depth-sensing camera, and determining a plane where the ground is located in the augmented reality equipment according to the point cloud information;
determining the display position and the display posture of the navigation mark in the augmented reality equipment based on the plane where the ground is located and the direction information;
and displaying the navigation identifier according to the display position and the display posture.
In some embodiments, before the obtaining connectivity information and distance information between a plurality of landmark nodes in the point cloud map, the method further comprises:
acquiring sensor data corresponding to the target environment through a terminal sensor;
utilizing an instant positioning and map building system to carry out map reconstruction processing on the sensor data to obtain an initial point cloud map;
determining a plurality of landmark nodes, and determining connectivity information and distance information among the landmark nodes;
and storing the initial point cloud map, the plurality of landmark nodes and connectivity information and distance information among the landmark nodes as the point cloud map.
In some embodiments, before the obtaining connectivity information and distance information between a plurality of landmark nodes in the point cloud map, the method further comprises:
and generating an adjacency matrix according to the connectivity information and the distance information among the plurality of road sign nodes.
In a second aspect, an embodiment of the present application provides a navigation device, including:
the acquisition unit is configured to acquire connectivity information and distance information among a plurality of landmark nodes in a point cloud map, and the point cloud map is constructed according to a target environment;
a determination unit configured to determine a start point and an end point from the plurality of landmark nodes;
the navigation unit is configured to generate a target navigation path according to the starting point, the end point, the connectivity information and the distance information;
a display unit configured to display a navigation identifier on the augmented reality device based on the target navigation path.
In a third aspect, an embodiment of the present application provides an augmented reality device, including:
a memory for storing a computer program capable of running on the processor;
a processor for performing the navigation method according to any of the first aspect when the computer program is run.
In a fourth aspect, an embodiment of the present application provides a computer storage medium storing a computer program, which when executed by at least one processor implements the navigation method according to any one of the first aspect.
The embodiment of the application provides a navigation method, apparatus, device, and computer storage medium. The method obtains connectivity information and distance information among a plurality of landmark nodes in a point cloud map, where the point cloud map is constructed according to a target environment; determines a starting point and an end point from the plurality of landmark nodes; generates a target navigation path according to the starting point, the end point, the connectivity information, and the distance information; and displays a navigation identifier on the augmented reality device based on the target navigation path. Because path navigation between the starting point and the end point is generated from the connectivity information and distance information between landmark nodes, the path is planned dynamically and no additional three-dimensional reconstruction is needed, which both reduces computational complexity and improves navigation efficiency. In addition, the navigation identifier is displayed on the augmented reality device in real time, which further improves navigation efficiency and the user's navigation experience.
Drawings
Fig. 1 is a schematic diagram of a dynamic navigation effect based on three-dimensional reconstruction provided in the related art;
FIG. 2 is a schematic diagram of a navigation effect of a full-route label provided in the related art;
fig. 3 is a schematic flowchart of a navigation method according to an embodiment of the present application;
FIG. 4 is a schematic flow chart illustrating another navigation method according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram illustrating annotation of landmark node information according to an embodiment of the present application;
FIG. 6 is a diagram illustrating an example of a adjacency matrix according to an embodiment of the present application;
FIG. 7 is a flowchart illustrating another navigation method according to an embodiment of the present application;
fig. 8 is a schematic view illustrating a display effect of a navigation mark according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a navigation device according to an embodiment of the present application;
fig. 10 is a schematic diagram of a specific hardware structure of an augmented reality device according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of an augmented reality device according to an embodiment of the present application.
Detailed Description
So that the features and elements of the present embodiments can be understood in detail, a more particular description of the embodiments, briefly summarized above, is given below with reference to specific embodiments, some of which are illustrated in the appended drawings.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
In the following description, reference is made to "some embodiments", which describe a subset of all possible embodiments; it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other where there is no conflict. It should also be noted that the terms "first", "second", and "third" in the embodiments of the present application are used only to distinguish similar objects and do not imply a specific ordering; where permissible, "first", "second", and "third" may be interchanged so that the embodiments described herein can be implemented in an order other than that shown or described.
It can be understood that SLAM, i.e. Simultaneous Localization and Mapping, works on the following principle: when a device is in an unknown environment, it acquires its motion state and surrounding-environment information through its sensors, reconstructs the three-dimensional structure of the surroundings in real time, and localizes itself at the same time. Relocalization is an important module of SLAM systems: when the device re-enters a mapped environment, its six-degree-of-freedom (6DoF) pose is calculated from the current visual information and the previously built map.
In the related art, a solution for three-dimensional space navigation currently exists. One of the schemes is based on mesh (mesh) search after three-dimensional reconstruction, and three-dimensional mesh reconstruction is performed on a movable path in advance, and then path search is performed on the mesh. Another scheme is to record a route from a starting point to an end point in advance, so that point-to-point navigation of a fixed route can be realized.
Exemplarily, referring to fig. 1, a schematic diagram of a dynamic navigation effect based on three-dimensional reconstruction is shown. As shown in fig. 1, a dynamic navigation scheme based on three-dimensional mesh patches can perform real-time path planning, but it requires the spatial mesh patches to be reconstructed in advance, so the navigation map has a large data volume and consumes considerable computing resources.
Referring to fig. 2, a schematic diagram of the navigation effect of full-route annotation is shown. As shown in fig. 2, whole-course route guidance can be realized by annotating the full route in advance, but dynamic route planning is then impossible, and each route must be annotated manually, which entails a heavy workload and wastes human resources.
Based on this, the embodiment of the present application provides a navigation method, which is applied to augmented reality equipment, and the basic idea of the method is: the method comprises the steps of obtaining connectivity information and distance information among a plurality of road sign nodes in a point cloud map, and constructing the point cloud map according to a target environment; determining a starting point and an end point from a plurality of road sign nodes; generating a target navigation path according to the starting point, the end point, the connectivity information and the distance information; and displaying the navigation identifier on the augmented reality equipment based on the target navigation path. Therefore, path navigation between the starting point and the end point is generated through connectivity information and distance information between the road sign nodes, dynamic planning of the path is achieved, and extra three-dimensional reconstruction is avoided, so that not only is the calculation complexity reduced, but also the navigation efficiency is improved; in addition, the navigation identification is displayed on the augmented reality equipment in real time, so that the navigation efficiency is further improved, and the navigation use experience of a user can be improved.
In an embodiment of the present application, referring to fig. 3, a flowchart of a navigation method provided in the embodiment of the present application is shown and applied to an augmented reality device. As shown in fig. 3, the method may include:
s101: and acquiring connectivity information and distance information among a plurality of road sign nodes in the point cloud map, and constructing the point cloud map according to the target environment.
It should be noted that the navigation method provided by the embodiment of the present application may be applied to any electronic device with a navigation requirement, such as a computer, smart phone, tablet computer, notebook computer, palmtop computer, Personal Digital Assistant (PDA), Augmented Reality (AR) device, or Virtual Reality (VR) device. The embodiments of the present application take an augmented reality device as an example for description, but are not limited thereto.
It should be further noted that, in the augmented reality device, connectivity information and distance information between a plurality of landmark nodes in the point cloud map need to be acquired first, so as to determine a navigation path in the following. Here, a point cloud map may be constructed according to the target environment.
S102: a starting point and an ending point are determined from the plurality of landmark nodes.
It should be noted that the starting point and the end point are the two endpoints of the navigation path; each may be any one of the plurality of landmark nodes in the point cloud map. In some specific embodiments, the starting point may be a fixed, preset position while the end point varies; illustratively, for a hotel meal-delivery device, the starting point is always the kitchen and the end point may be any of the rooms.
In some embodiments, for S102, the determining a starting point and an ending point from the plurality of landmark nodes may include:
acquiring the position information of the augmented reality equipment on the target point cloud map, and determining a landmark node corresponding to the position information as the starting point;
and determining the destination of the road sign node corresponding to the user input information based on the user input information.
It should be further noted that when the augmented reality device is located on a landmark node, that landmark node is determined as the starting point; when it is not, the landmark node closest to the device's position is determined as the starting point. In scenarios where the starting position is fixed, such as a hotel meal-delivery device, the step of determining the starting point may be omitted.
Further, in some embodiments, the obtaining the location information of the augmented reality device on the target point cloud map may include:
acquiring current visual information of the augmented reality device;
and calculating the pose of the augmented reality equipment according to the visual information and the point cloud map to obtain the position information of the augmented reality equipment.
It should be noted that a large amount of image information of the target environment is stored in the point cloud map. After the augmented reality device acquires its current visual information, its position information is determined by comparing the environment image in the current visual information with the image information in the point cloud map according to image similarity.
It should be further noted that, if the position information shows that the position of the augmented reality device is on a landmark node, the landmark node is determined as a starting point, and if the position information shows that the position of the augmented reality device is not on the landmark node, a landmark node closest to the augmented reality device is determined as a starting point.
In this way, a start point and an end point are determined from the location information of the augmented reality device and the user input information for subsequent generation of a navigation path.
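The starting-point selection described above can be sketched in a few lines (the disclosure gives no code; names are hypothetical). Choosing the landmark node closest to the relocalized device position covers both cases, since a device standing exactly on a node is trivially closest to it:

```python
import math

def nearest_landmark(position, landmarks):
    """Return the name of the landmark node closest to the device position.

    position: (x, y, z) of the AR device in the point-cloud-map frame.
    landmarks: mapping of node name -> (x, y, z) coordinates.
    """
    return min(landmarks, key=lambda name: math.dist(position, landmarks[name]))
```

For example, with landmark nodes at the kitchen (0, 0, 0) and room 101 (5, 0, 0), a device relocalized at (1, 0.5, 0) starts from the kitchen node.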
S103: and generating a target navigation path according to the starting point, the end point, the connectivity information and the distance information.
It should be noted that, in order to ensure the efficiency of the navigation path, when the target navigation path is generated, the navigation path with the shortest overall distance may be selected, and for example, the target navigation path may be determined by using a shortest path algorithm.
In some embodiments, the connectivity information and the distance information are recorded in an adjacency matrix, and for S103, the generating a target navigation path according to the starting point, the end point, the connectivity information, and the distance information may include:
determining a target landmark node between the starting point and the ending point based on the starting point and the ending point;
acquiring a corresponding target adjacency matrix from the adjacency matrix; wherein the target adjacency matrix comprises a starting point, an end point and a target landmark node;
determining a shortest path from the starting point to the end point through a shortest path algorithm by using the distance between two points in the target adjacency matrix as an edge weight based on the target adjacency matrix;
determining the shortest path as the target navigation path.
It should be noted that, in the process of obtaining target landmark nodes, the landmark nodes adjacent to the starting point are obtained first, and their connectivity with the starting point is determined. If a landmark node is connected to the starting point and the direction from the starting point to that node is consistent with the direction toward the end point, the node is determined to be a target landmark node; the next target landmark node is then determined from this one in the same way, which is not repeated here.
It should be noted that after all target landmark nodes are determined, a target adjacency matrix containing the starting point, the end point, and the target landmark nodes is extracted from the adjacency matrix corresponding to the point cloud map. The target adjacency matrix records the connectivity information and distance information between the starting point, the end point, and the target landmark nodes: when two points are connected, their distance is recorded; when two points are not connected, the entry is empty; and when two points coincide, the entry is 0.
Thus, the target adjacent matrix is obtained according to the starting point and the end point, the shortest path between the starting point and the end point is obtained based on the target adjacent matrix, and finally the target navigation path is obtained.
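Extracting the target adjacency matrix for the chosen starting point, end point, and target landmark nodes amounts to taking a sub-matrix of the full adjacency matrix; a minimal sketch with hypothetical names (indices are illustrative, not from the disclosure):

```python
def target_adjacency(adj, nodes):
    """Sub-matrix of the full adjacency matrix restricted to the given nodes.

    Entries follow the convention in the text: the distance when two points
    are connected, None ("empty") when they are not, and 0 on the diagonal
    (two coinciding points).
    """
    return [[adj[i][j] for j in nodes] for i in nodes]
```

The resulting smaller matrix is what the shortest-path search operates on, which keeps the search cost proportional to the nodes between start and end rather than to the whole map.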
S104: and displaying the navigation identifier on the augmented reality equipment based on the target navigation path.
It should be noted that before displaying the navigation identifier, it is necessary to determine whether the augmented reality device has a depth-sensing camera. If it does, the navigation identifier may be displayed according to the ground plane acquired by the depth-sensing camera; if it does not, the navigation identifier is displayed at a fixed distance.
In some embodiments, for S104, the displaying, on the augmented reality device, a navigation identifier based on the target navigation path may include:
determining direction information corresponding to the navigation identification according to the target navigation path;
acquiring point cloud information of the ground by using a depth-sensing camera, and determining a plane where the ground is located in the augmented reality equipment according to the point cloud information;
determining the display position and the display posture of the navigation mark in the augmented reality equipment based on the plane where the ground is located and the direction information;
and displaying the navigation identifier according to the display position and the display posture.
It should be noted that a depth-sensing camera can capture dynamic user information during photographing and gaming, enabling various interactions. Specifically, in the embodiment of the application, the depth-sensing camera acquires point cloud information of the ground in the current state; the plane where the ground lies can be determined from this point cloud information, or the real distance between the augmented reality device and the ground can be converted into the device's coordinate frame. The navigation identifier can then be displayed accurately in the augmented reality device so that, to the user's eyes, it appears attached to the ground, giving a stronger sense of immersion and a better overall experience.
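The disclosure does not give formulas for the display position and posture; one plausible sketch (all names are hypothetical) projects the device position onto the detected ground plane and flattens the travel direction into that plane so the identifier lies on the ground:

```python
import math

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def identifier_pose(device_pos, direction, plane_normal, plane_point):
    """Display position and direction for a navigation identifier on the ground.

    device_pos: device position; direction: vector toward the next path node;
    plane_normal (assumed unit length) and plane_point define the ground plane
    fitted from the depth camera's point cloud.
    """
    # drop the device position onto the ground plane along the plane normal
    offset = _dot([p - q for p, q in zip(device_pos, plane_point)], plane_normal)
    display_pos = tuple(p - offset * n for p, n in zip(device_pos, plane_normal))
    # remove the out-of-plane component so the identifier lies flat
    along = _dot(direction, plane_normal)
    flat = [d - along * n for d, n in zip(direction, plane_normal)]
    norm = math.sqrt(_dot(flat, flat)) or 1.0
    display_dir = tuple(f / norm for f in flat)
    return display_pos, display_dir
```

For a device 1.5 m above a level floor, this places the identifier directly below the device and orients it along the horizontal component of the path direction.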
Further, in some embodiments, before the obtaining connectivity information and distance information between a plurality of landmark nodes in the point cloud map, the method may further include:
acquiring sensor data corresponding to the target environment through a terminal sensor;
utilizing an instant positioning and map building system to carry out map reconstruction processing on the sensor data to obtain an initial point cloud map;
determining a plurality of landmark nodes, and determining connectivity information and distance information among the landmark nodes;
and storing the initial point cloud map, the plurality of landmark nodes and connectivity information and distance information among the landmark nodes as the point cloud map.
It should be noted that, when determining the landmark nodes, positions in the initial point cloud map where turns and intersections exist are specifically determined as landmark nodes. After all landmark nodes are determined, it must be ensured that the path between any two adjacent landmark nodes is a straight line, and the connectivity information and distance information between landmark nodes may be set according to specific requirements.
For example, if the augmented reality device should never pass through a certain location, the connectivity between the two landmark nodes spanning that location may simply be set to disconnected; if the device should pass through a location as rarely as possible, the distance recorded between the two landmark nodes spanning it may be set larger.
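The two policies above (forbidding a segment versus merely discouraging it) map directly onto edits of the adjacency matrix; an illustrative sketch with hypothetical names:

```python
def block_edge(adj, i, j):
    """Forbid passage between nodes i and j by marking them unconnected."""
    adj[i][j] = adj[j][i] = None

def penalize_edge(adj, i, j, factor):
    """Discourage passage between i and j by inflating the stored distance,
    so the shortest-path search prefers other routes."""
    if adj[i][j] is not None:
        adj[i][j] = adj[j][i] = adj[i][j] * factor
```

Both operations act on the same matrix the shortest-path search consumes, so no other part of the pipeline needs to change.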
Further, in some embodiments, before the obtaining connectivity information and distance information between a plurality of landmark nodes in the point cloud map, the method may further include: and generating an adjacency matrix according to the connectivity information and the distance information among the plurality of road sign nodes.
It should be noted that the adjacency matrix is determined according to the landmark nodes and connectivity information and distance information between the landmark nodes, and the adjacency matrix includes each landmark node as a node, and the distance information between the landmark nodes as an edge, and is used for realizing storage and sorting of information corresponding to the landmark nodes so as to generate the navigation path through the shortest path algorithm.
The embodiment of the application provides a navigation method, which is applied to augmented reality equipment, and is used for acquiring connectivity information and distance information among a plurality of road sign nodes in a point cloud map, wherein the point cloud map is constructed according to a target environment; determining a starting point and an end point from a plurality of road sign nodes; generating a target navigation path according to the starting point, the end point, the connectivity information and the distance information; and displaying the navigation identifier on the augmented reality equipment based on the target navigation path. Therefore, path navigation between the starting point and the end point is generated through connectivity information and distance information between the road sign nodes, the navigation efficiency is improved, meanwhile, the navigation identification can be displayed on the augmented reality equipment in real time, and the navigation use experience of a user is improved.
In another embodiment of the present application, based on the navigation method described in the foregoing embodiment, a graph search method may be adopted that uses the distances between landmark nodes as weights to dynamically search for the shortest path between two points for navigation. In this case, path searching can be realized dynamically simply by labeling the landmark nodes in advance, without additional three-dimensional reconstruction. The method comprises two aspects: a mapping process and a positioning process.
For the mapping process, refer to fig. 4, which shows a schematic flowchart of another navigation method provided by the embodiment of the present application. The implementation of the mapping process is described in detail below; as shown in fig. 4, the method may include:
S401: starting the SLAM system and entering the mapping mode;
After the SLAM system is started, S402 and S404 may be performed respectively. There is no fixed order between the two: they may be executed in parallel, S402 may be executed before S404, or S404 may be executed before S402; this is not specifically limited.
S402: labeling the landmark nodes and the target points, and assigning data types;
It should be noted that the data type may include the type of a landmark node. Once the type of a landmark node is determined, the connectivity and distance between that landmark node and its adjacent landmark nodes need to be labeled.
S403: dynamically generating node weights and an adjacency matrix based on the three-dimensional coordinates of the landmark nodes;
It should be noted that the node weight is determined according to the distance between a landmark node and its adjacent landmark nodes. The adjacency matrix is determined from the landmark nodes together with the connectivity information and distance information between them: each landmark node serves as a node, and the distance information between landmark nodes serves as an edge. The adjacency matrix is used to store and organize the information corresponding to the landmark nodes, so that the navigation path can be generated by a shortest path algorithm.
S404: point cloud map reconstruction;
it should be noted that the point cloud map can be reconstructed according to the target environment. In the point cloud map construction, a starting point, an end point and road sign nodes are marked, the connectivity of each node is edited after marking, the key road sign nodes are mainly reconstructed, and the points in the point cloud map are relatively sparse, so that the point cloud map can be also called as a point cloud map.
S405: and storing the point cloud map, the road sign node labels and the adjacency matrix.
Specifically, taking AR glasses as an example, the mapping process may include: starting the AR glasses, labeling the starting point, the end point and the landmark nodes in the space, and editing the connectivity of each landmark node after labeling. For example, referring to fig. 5, each landmark node is labeled as a starting point (represented by grid lines), an intermediate node (i.e., a target landmark node in the foregoing embodiment) or a target point (i.e., an end point in the foregoing embodiment). The intermediate nodes are filled in gray, such as middle 1, middle 2 and middle 3 in fig. 5; the end points are filled with diagonal lines, such as end 1, end 2 and end 3 in fig. 5. Where adjacent landmark nodes are connected, the distance information between the two points is labeled e1, e2, e3, ..., en.
Further, edge weights are assigned using the Euclidean distances between the spatial points: each landmark node serves as a node, and the distance between landmark nodes serves as an edge, to generate the adjacency matrix. After editing is finished, the point cloud map, the landmark node labels and the adjacency matrix are stored. For example, fig. 6 shows a specific form of the target adjacency matrix: where two points are connected, the distance between them is recorded in the matrix (as e1, e2, e3, ...); where two points are not connected, the entry in the matrix is empty; and where two points coincide, the entry is 0.
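The weight assignment just described can be sketched as follows. The coordinates and labels are illustrative assumptions, and infinity stands in for the "empty" entries of fig. 6; none of these details are fixed by the patent:

```python
import math

# Illustrative node coordinates (meters) and labeled connectivity,
# loosely following the labels of fig. 5; not taken from the patent.
coords = {
    "start": (0.0, 0.0, 0.0),
    "mid1":  (3.0, 0.0, 0.0),
    "mid2":  (3.0, 4.0, 0.0),
    "end1":  (6.0, 4.0, 0.0),
}
edges = [("start", "mid1"), ("mid1", "mid2"), ("mid2", "end1")]  # labeled as connected

nodes = list(coords)
INF = float("inf")  # stands in for the "empty" entries of fig. 6

def euclidean(p, q):
    """Euclidean distance between two spatial points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Adjacency matrix: 0 on the diagonal (coincident points), the Euclidean
# distance where two nodes are connected, and "empty" (here: INF) otherwise.
adj = [[0.0 if i == j else INF for j in nodes] for i in nodes]
for a, b in edges:
    i, j = nodes.index(a), nodes.index(b)
    d = euclidean(coords[a], coords[b])
    adj[i][j] = adj[j][i] = d
```

The resulting symmetric matrix is exactly what a shortest-path search consumes in the positioning process.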
In the positioning process, refer to fig. 7, which shows a schematic flowchart of another navigation method provided in the embodiment of the present application. The implementation of the positioning process is described in detail below; as shown in fig. 7, the method may further include:
S701: loading the adjacency matrix and the point cloud map, and inputting the position of the target point.
It should be noted that the point cloud map may be the point cloud map reconstructed in the above embodiment.
S702: generating the path according to the adjacency matrix and Dijkstra's shortest path algorithm.
S703: judging whether the augmented reality device has a depth-sensing camera.
Specifically, in the case where the augmented reality device has a depth-sensing camera, S704 may be executed; in the case where it has no depth-sensing camera, S705 may be executed.
S704: the AR path is rendered on the ground.
S705: the AR path is rendered 1.5 m below the augmented reality device.
Specifically, the positioning process may include: loading the adjacency matrix and the point cloud map, and calculating the shortest path to the end point by Dijkstra's algorithm. The corresponding signposts in the spatial path are then rendered to realize the navigation process. After the navigation path is determined, a navigation identifier may be displayed in the augmented reality device, as illustrated in fig. 8.
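The search in S702 can be sketched as a standard Dijkstra pass over the stored adjacency matrix, with infinity standing in for the "empty" (disconnected) entries of fig. 6 and 0 for coincident points. This is a minimal illustrative sketch, not the patent's implementation:

```python
import heapq

INF = float("inf")

def dijkstra(adj, start, end):
    """Shortest path over an adjacency matrix whose entries are edge weights
    (INF = not connected, 0 = coincident points), as in step S702."""
    n = len(adj)
    dist = [INF] * n
    prev = [None] * n
    dist[start] = 0.0
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue          # stale queue entry
        if u == end:
            break             # end point settled; path is final
        for v in range(n):
            w = adj[u][v]
            if w < INF and d + w < dist[v]:
                dist[v] = d + w
                prev[v] = u
                heapq.heappush(heap, (d + w, v))
    path, node = [], end
    while node is not None:   # walk predecessors back to the start
        path.append(node)
        node = prev[node]
    return path[::-1], dist[end]

# Small example matrix (indices: 0 = start, 1 = intermediate node, 2 = end).
adj = [
    [0.0, 2.0, INF],
    [2.0, 0.0, 3.0],
    [INF, 3.0, 0.0],
]
path, length = dijkstra(adj, 0, 2)  # -> path [0, 1, 2] with length 5.0
```

The returned node sequence is what the rendering step turns into signposts along the spatial path.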
The embodiment of the application provides a navigation method, and the above elaborates a specific implementation of the foregoing embodiments. It can be seen that, according to the technical solution of this embodiment, in the mapping process the landmark nodes are labeled and the connectivity information and distance information between each landmark node and its adjacent landmark nodes are edited; in the positioning process, path navigation between the starting point and the end point is generated through the connectivity information and distance information between the landmark nodes. This improves navigation efficiency; meanwhile, the navigation identifier can be displayed on the augmented reality device in real time, improving the user's navigation experience.
In another embodiment of the present application, refer to fig. 9, which shows a schematic structural diagram of a navigation device provided in the embodiment of the present application. As shown in fig. 9, the navigation device 90 may include:
an obtaining unit 901 configured to obtain connectivity information and distance information between a plurality of landmark nodes in a point cloud map, where the point cloud map is constructed according to a target environment;
a determining unit 902 configured to determine a starting point and an end point from the plurality of landmark nodes;
a navigation unit 903 configured to generate a target navigation path according to the starting point, the end point, the connectivity information, and the distance information;
a display unit 904 configured to display a navigation identifier on the augmented reality device based on the target navigation path.
In some embodiments, the determining unit 902 is specifically configured to: obtain position information of the augmented reality device on the target point cloud map, and determine the landmark node corresponding to the position information as the starting point; and, based on user input information, determine the landmark node corresponding to the user input information as the end point.
In some embodiments, the determining unit 902 is specifically configured to obtain current visual information of the augmented reality device; and performing pose calculation on the augmented reality equipment according to the visual information and the point cloud map to obtain the position information of the augmented reality equipment.
In some embodiments, the connectivity information and the distance information are recorded in an adjacency matrix, and the navigation unit 903 is specifically configured to: determine target landmark nodes between the starting point and the end point based on the starting point and the end point; obtain a corresponding target adjacency matrix from the adjacency matrix, where the target adjacency matrix comprises the starting point, the end point and the target landmark nodes; determine, based on the target adjacency matrix, a shortest path from the starting point to the end point through a shortest path algorithm, using the distance between two points in the target adjacency matrix as the edge weight; and determine the shortest path as the target navigation path.
In some embodiments, the display unit 904 is specifically configured to determine, according to the target navigation path, direction information corresponding to the navigation identifier; acquiring point cloud information of the ground by using a depth sensing camera, and determining a plane where the ground is located in the augmented reality equipment according to the point cloud information; determining the display position and the display posture of the navigation mark in the augmented reality equipment based on the plane where the ground is located and the direction information; and displaying the navigation identifier according to the display position and the display posture.
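The display behavior described above (place the marker on the ground plane fitted from the depth point cloud when a depth-sensing camera is present, otherwise 1.5 m below the device as in S705, and orient it along the path direction) can be sketched as follows. The function name, parameters, and y-up coordinate convention are all hypothetical, since the patent does not fix an API:

```python
import math

def marker_pose(path_point, next_point, ground_height=None, device_height=0.0):
    """Return a (position, yaw) display pose for one navigation marker.
    ground_height: y of the plane fitted from the depth-camera point cloud,
    or None when no depth-sensing camera is available."""
    x, y, z = path_point
    if ground_height is not None:
        y = ground_height          # render on the detected ground plane
    else:
        y = device_height - 1.5    # fallback: 1.5 m below the device
    dx = next_point[0] - x
    dz = next_point[2] - z
    yaw = math.atan2(dx, dz)       # marker faces along the next path segment
    return (x, y, z), yaw
```

In use, each node of the target navigation path would be passed through such a function before rendering, so position comes from the plane and posture from the direction information.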
In some embodiments, the navigation device 90 may further include a mapping unit 905 configured to: obtain sensor data corresponding to the target environment through a terminal sensor; perform map reconstruction processing on the sensor data using a simultaneous localization and mapping (SLAM) system to obtain an initial point cloud map; determine a plurality of landmark nodes, and determine the connectivity information and distance information among the plurality of landmark nodes; and store the initial point cloud map, the plurality of landmark nodes, and the connectivity information and distance information between the plurality of landmark nodes as the point cloud map.
In some embodiments, the mapping unit 905 is further configured to generate an adjacency matrix according to the connectivity information and distance information between the plurality of landmark nodes.
It is understood that, in this embodiment, a "unit" may be part of a circuit, part of a processor, part of a program or software, and so on; it may also be a module, or it may be non-modular. Moreover, the components in this embodiment may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit can be realized in the form of hardware or in the form of a software functional module.
Based on this understanding, the technical solution of this embodiment, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to execute all or part of the steps of the method of this embodiment. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
Accordingly, the present embodiments provide a computer storage medium storing a navigation program that, when executed by at least one processor, implements the steps of the method of any of the preceding embodiments.
Based on the composition of the navigation device 90 and the computer storage medium, refer to fig. 10, which shows a specific hardware structure diagram of an augmented reality device provided in an embodiment of the present application. As shown in fig. 10, the augmented reality device 100 may include: a communication interface 1001, a memory 1002, and a processor 1003; the various components are coupled together by a bus system 1004. It is understood that the bus system 1004 is used to enable connection and communication among these components. In addition to a data bus, the bus system 1004 includes a power bus, a control bus, and a status signal bus; however, for the sake of clarity, the various buses are all labeled as the bus system 1004 in fig. 10. The communication interface 1001 is used for receiving and sending signals in the process of exchanging information with other external network elements;
a memory 1002 for storing a computer program capable of running on the processor 1003;
a processor 1003 configured to, when running the computer program, perform:
the method comprises the steps of obtaining connectivity information and distance information among a plurality of road sign nodes in a point cloud map, wherein the point cloud map is constructed according to a target environment;
determining a starting point and an end point from the plurality of landmark nodes;
generating a target navigation path according to the starting point, the end point, the connectivity information and the distance information;
and displaying a navigation identifier on the augmented reality equipment based on the target navigation path.
It is to be appreciated that the memory 1002 in this embodiment can be either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which acts as an external cache. By way of example but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM). The memory 1002 of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
The processor 1003 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor 1003 or by instructions in the form of software. The processor 1003 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or execute the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly executed by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM or EPROM, or a register. The storage medium is located in the memory 1002, and the processor 1003 reads the information in the memory 1002 and completes the steps of the above method in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the Processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units configured to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
Optionally, as another embodiment, the processor 1003 is further configured to execute the steps of the method in any one of the preceding embodiments when running the computer program.
In some embodiments, refer to fig. 11, which shows a schematic structural diagram of an augmented reality device 100 provided in an embodiment of the present application. As shown in fig. 11, the augmented reality apparatus 100 at least includes the navigation device 90 according to any one of the previous embodiments.
In the embodiment of the application, for the augmented reality device 100, connectivity information and distance information between a plurality of landmark nodes in a point cloud map are acquired, the point cloud map being constructed according to a target environment; a starting point and an end point are determined from the plurality of landmark nodes; a target navigation path is generated according to the starting point, the end point, the connectivity information and the distance information; and a navigation identifier is displayed on the augmented reality device based on the target navigation path. In this way, path navigation between the starting point and the end point is generated through the connectivity information and distance information between the landmark nodes, realizing dynamic path planning and avoiding additional three-dimensional reconstruction, which not only reduces the computational complexity but also improves navigation efficiency; in addition, the navigation identifier is displayed on the augmented reality device in real time, which further improves navigation efficiency and the user's navigation experience.
It should be noted that, in the present application, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
The methods disclosed in the several method embodiments provided in the present application may be combined arbitrarily without conflict to obtain new method embodiments.
Features disclosed in several of the product embodiments provided in the present application may be combined in any combination to yield new product embodiments without conflict.
The features disclosed in the several method or apparatus embodiments provided in the present application may be combined arbitrarily, without conflict, to arrive at new method embodiments or apparatus embodiments.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A navigation method, applied to an augmented reality device, the method comprising:
obtaining connectivity information and distance information among a plurality of landmark nodes in a point cloud map, wherein the point cloud map is constructed according to a target environment;
determining a starting point and an end point from the plurality of landmark nodes;
generating a target navigation path according to the starting point, the end point, the connectivity information and the distance information;
and displaying a navigation identifier on the augmented reality equipment based on the target navigation path.
2. The method of claim 1, the determining a starting point and an ending point from the plurality of landmark nodes, comprising:
acquiring the position information of the augmented reality equipment on the target point cloud map, and determining a landmark node corresponding to the position information as the starting point;
and determining, based on user input information, the landmark node corresponding to the user input information as the end point.
3. The method of claim 2, the obtaining location information of the augmented reality device on the target point cloud map, comprising:
acquiring current visual information of the augmented reality device;
and calculating the pose of the augmented reality equipment according to the visual information and the point cloud map to obtain the position information of the augmented reality equipment.
4. The method of claim 1, the connectivity information and the distance information recorded in an adjacency matrix, the generating a target navigation path from the start point, the end point, the connectivity information, and the distance information comprising:
determining a target landmark node between the starting point and the ending point based on the starting point and the ending point;
acquiring a corresponding target adjacency matrix from the adjacency matrix; wherein the target adjacency matrix comprises a starting point, an end point and a target landmark node;
determining a shortest path from the starting point to the end point through a shortest path algorithm by using the distance between two points in the target adjacency matrix as an edge weight based on the target adjacency matrix;
determining the shortest path as the target navigation path.
5. The method of any of claims 1 to 4, the displaying a navigation identifier on the augmented reality device based on the target navigation path, comprising:
determining direction information corresponding to the navigation identification according to the target navigation path;
acquiring point cloud information of the ground by using a depth-sensing camera, and determining a plane where the ground is located in the augmented reality equipment according to the point cloud information;
determining the display position and the display posture of the navigation mark in the augmented reality equipment based on the plane where the ground is located and the direction information;
and displaying the navigation identifier according to the display position and the display posture.
6. The method of claim 1, prior to obtaining connectivity information and distance information between a plurality of landmark nodes in a point cloud map, the method further comprising:
acquiring sensor data corresponding to the target environment through a terminal sensor;
performing map reconstruction processing on the sensor data by using a simultaneous localization and mapping (SLAM) system to obtain an initial point cloud map;
determining a plurality of landmark nodes, and determining connectivity information and distance information among the landmark nodes;
and storing the initial point cloud map, the plurality of landmark nodes and connectivity information and distance information among the landmark nodes as the point cloud map.
7. The method of claim 6, prior to obtaining connectivity information and distance information between a plurality of landmark nodes in a point cloud map, the method further comprising:
and generating an adjacency matrix according to the connectivity information and distance information among the plurality of landmark nodes.
8. A navigation device, comprising:
the acquisition unit is configured to acquire connectivity information and distance information among a plurality of landmark nodes in a point cloud map, and the point cloud map is constructed according to a target environment;
a determination unit configured to determine a start point and an end point from the plurality of landmark nodes;
the navigation unit is configured to generate a target navigation path according to the starting point, the end point, the connectivity information and the distance information;
a display unit configured to display a navigation identifier on the augmented reality device based on the target navigation path.
9. An augmented reality device comprising:
a memory for storing a computer program capable of running on the processor;
a processor for performing the method of any one of claims 1 to 7 when running the computer program.
10. A computer storage medium storing a computer program which, when executed by at least one processor, implements the method of any one of claims 1 to 7.
CN202111681859.XA 2021-12-30 2021-12-30 Navigation method, device, equipment and computer storage medium Pending CN114413919A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111681859.XA CN114413919A (en) 2021-12-30 2021-12-30 Navigation method, device, equipment and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111681859.XA CN114413919A (en) 2021-12-30 2021-12-30 Navigation method, device, equipment and computer storage medium

Publications (1)

Publication Number Publication Date
CN114413919A true CN114413919A (en) 2022-04-29

Family

ID=81272411

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111681859.XA Pending CN114413919A (en) 2021-12-30 2021-12-30 Navigation method, device, equipment and computer storage medium

Country Status (1)

Country Link
CN (1) CN114413919A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114808830A (en) * 2022-06-30 2022-07-29 湖北省高创公路工程咨询监理有限公司 Anti-freezing system for highway bridge
CN115460547A (en) * 2022-08-05 2022-12-09 浙江浩瀚能源科技有限公司 Beacon generation method, device and computer storage medium
CN115824248A (en) * 2023-02-15 2023-03-21 交通运输部规划研究院 Navigation method and device of pure electric heavy truck

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108362298A (en) * 2018-02-22 2018-08-03 青岛融汇通投资控股有限公司 Air navigation aid and device in area map
CN111595349A (en) * 2020-06-28 2020-08-28 浙江商汤科技开发有限公司 Navigation method and device, electronic equipment and storage medium
CN111664866A (en) * 2020-06-04 2020-09-15 浙江商汤科技开发有限公司 Positioning display method and device, positioning method and device and electronic equipment
CN113063424A (en) * 2021-03-29 2021-07-02 湖南国科微电子股份有限公司 Method, device, equipment and storage medium for intra-market navigation
CN113074736A (en) * 2021-03-24 2021-07-06 中国工商银行股份有限公司 Indoor navigation positioning method, equipment, electronic equipment, storage medium and product
CN113570664A (en) * 2021-07-22 2021-10-29 北京百度网讯科技有限公司 Augmented reality navigation display method and device, electronic equipment and computer medium
US20210341309A1 (en) * 2019-01-14 2021-11-04 Zhejiang Huaray Technology Co., Ltd. Systems and methods for route planning
CN113841100A (en) * 2019-05-27 2021-12-24 索尼集团公司 Autonomous travel control apparatus, autonomous travel control system, and autonomous travel control method
CN113847927A (en) * 2021-09-30 2021-12-28 国汽智控(北京)科技有限公司 Path generation method, device, equipment, storage medium and program product


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114808830A (en) * 2022-06-30 2022-07-29 湖北省高创公路工程咨询监理有限公司 Anti-freezing system for highway bridge
CN115460547A (en) * 2022-08-05 2022-12-09 浙江浩瀚能源科技有限公司 Beacon generation method, device and computer storage medium
CN115824248A (en) * 2023-02-15 2023-03-21 交通运输部规划研究院 Navigation method and device of pure electric heavy truck
CN115824248B (en) * 2023-02-15 2023-04-21 交通运输部规划研究院 Navigation method and device for pure electric heavy truck

Similar Documents

Publication Publication Date Title
US12101371B2 (en) Platform for constructing and consuming realm and object feature clouds
CN114413919A (en) Navigation method, device, equipment and computer storage medium
Arth et al. Wide area localization on mobile phones
Luo et al. Geotagging in multimedia and computer vision—a survey
CN104180814A (en) Navigation method in live-action function on mobile terminal, and electronic map client
Peng et al. CrowdGIS: Updating digital maps via mobile crowdsensing
CN110413719A (en) Information processing method and device, equipment, storage medium
CN112148742B (en) Map updating method and device, terminal and storage medium
CN104881860A (en) Positioning method and apparatus based on photographs
CN109558470B (en) Trajectory data visualization method and device
JP2021505978A (en) Storage and loading methods, devices, systems and storage media for visual self-location estimation maps
CN108711144A (en) augmented reality method and device
CN111859002B (en) Interest point name generation method and device, electronic equipment and medium
CN113393515B (en) Visual positioning method and system combining scene annotation information
CN110926478B (en) AR navigation route deviation rectifying method and system and computer readable storage medium
KR20190086032A (en) Contextual map view
CN107832331A (en) Generation method, device and the equipment of visualized objects
CN114061586A (en) Method and product for generating navigation path of electronic device
Ma et al. Mobile augmented reality based indoor map for improving geo-visualization
Aydın et al. ARCAMA-3D–a context-aware augmented reality mobile platform for environmental discovery
CN114792111A (en) Data acquisition method and device, electronic equipment and storage medium
Sharma et al. Navigation in AR based on digital replicas
WO2023246537A1 (en) Navigation method, visual positioning method, navigation map construction method, and electronic device
CN103167032A (en) Map-aided indoor positioning background service system
Li et al. BDLoc: Global localization from 2.5 D building map

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination