CN112146656B - Indoor navigation visualization method based on augmented reality - Google Patents
- Publication number
- CN112146656B (application CN202010913376.7A)
- Authority
- CN
- China
- Prior art keywords: poi, information, mobile terminal, coordinate, navigation
- Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
Abstract
The invention belongs to the technical field of visualization and discloses an indoor navigation visualization method based on augmented reality, which comprises: acquiring indoor POI information located within a first range of a target point according to navigation target information; screening the indoor POI information according to sensor information and position information of a mobile terminal to obtain first POI information; acquiring basic equipment information of the mobile terminal, acquiring sensor information of the mobile terminal in real time, and updating POI screen coordinate information in real time in combination with the basic equipment information of the mobile terminal; acquiring navigation road information according to the position information of the mobile terminal and the navigation target information, and calculating the rotation angle of a guide arrow in real time; and rendering the guide arrow and the POI screen coordinate information in real time. According to the invention, the indoor navigation data is visualized and the virtual navigation information is superimposed on the real scene video stream, so that a virtual-real combined indoor navigation effect is realized and the navigation information becomes clearer and more intuitive.
Description
Technical Field
The invention relates to the technical field of visualization, in particular to an indoor navigation visualization method based on augmented reality.
Background
With the application and development of user-location-based technologies, location services have gradually extended from the macro level to the micro level. The Global Positioning System (GPS) is now well developed for outdoor positioning and navigation and basically meets people's outdoor travel needs. However, because of building occlusion and multipath effects, GPS signals undergo reflection, refraction, and scattering in indoor environments, so GPS cannot provide accurate indoor position information. Meanwhile, indoor environments are becoming more and more complex as society develops, and roughly eighty percent of human activity takes place indoors.
Most conventional navigation systems are outdoor systems, and some include an indoor navigation function. However, these indoor navigation modules merely superimpose a two-dimensional plane visualization on the map, cannot provide accurate indoor positioning and navigation services, and offer very limited help with wayfinding. In complex indoor environments, such as the interiors of large buildings (shopping centers, convention and exhibition centers, libraries, warehouses, underground parking lots, and the like), the indoor structure is repetitive, the information provided by a vectorized navigation map is not clear enough, and the user must still work out directions and paths indoors, which is very difficult for users with a poor sense of direction. Therefore, how to express indoor navigation information as intuitively and accurately as possible, so as to realize indoor navigation visualization, is an urgent problem to be solved.
Disclosure of Invention
In order to solve the problems in the prior art, the invention provides an indoor navigation visualization method based on augmented reality.
The invention provides an indoor navigation visualization method based on augmented reality, which comprises the following steps:
step 1, acquiring indoor POI information positioned in a first range of a target point according to navigation target information;
step 2, screening the indoor POI information according to sensor information of a mobile terminal and position information of the mobile terminal to obtain first POI information, wherein the first POI information is POI information within a visual angle range of the mobile terminal;
step 3, acquiring basic equipment information of the mobile terminal, acquiring sensor information of the mobile terminal in real time, and updating the POI screen coordinate information in real time by combining the basic equipment information of the mobile terminal; the POI screen coordinate information is screen mapping coordinate information obtained by overlaying the first POI information to a real video stream scene acquired by the mobile terminal;
step 4, obtaining navigation road information according to the position information of the mobile terminal and the navigation target information, and calculating a rotation angle of a guide arrow in real time;
and 5, rendering the guide arrow and the POI screen coordinate information in real time.
Preferably, the step 2 comprises the following substeps:
step 2.1, storing the current position information of the mobile terminal as a first Object, wherein the first Object comprises an x coordinate value and a y coordinate value; respectively storing each POI in the indoor POI information as a second Object, wherein the second Object comprises an x coordinate value, a y coordinate value, a distance to a current coordinate of the mobile terminal and a direction angle to a current position of the mobile terminal of the POI;
step 2.2, acquiring a gyroscope angle theta of the mobile terminal and a camera field angle alpha of the mobile terminal, and if preset conditions are met for each POI, saving data of the POI to a visual array and forming first POI information; the preset conditions are as follows:
θ − α/2 ≤ θ_i ≤ θ + α/2, and d_i ≤ D
wherein θ_i is the direction angle from the ith POI in the indoor POI information to the current coordinate of the mobile terminal, d_i is the distance from the ith POI to the current position of the mobile terminal, and D is a distance threshold.
Preferably, the step 3 comprises the following substeps:
step 3.1, obtaining the basic equipment information of the mobile terminal, wherein the basic equipment information comprises a screen pixel width W, a screen pixel height H, and an initial gyroscope angle Ω of the mobile terminal;
step 3.2, calculating initial screen coordinates of each POI in the first POI information according to the sensor information obtained in real time and by combining the basic information of the equipment, and adopting the following formula:
x_i = W/2 + (θ_i − Ω) · W/α,  y_i = H/2
wherein x_i is the initial x coordinate of the ith POI in the first POI information, and y_i is the initial y coordinate of the ith POI in the first POI information;
and 3.3, when the angle of the gyroscope of the mobile terminal does not exceed the field angle alpha of the camera of the mobile terminal, updating the screen coordinate of the POI according to the current angle of the gyroscope of the mobile terminal, wherein the updating adopts the following formula:
x_i′ = W/2 + (θ_i − θ) · W/α
wherein x_i′ is the updated x coordinate of the ith POI, θ is the current gyroscope angle of the mobile terminal, and θ_i is the direction angle from the ith POI in the indoor POI information to the current coordinate of the mobile terminal;
and when the angle of the gyroscope of the mobile terminal exceeds the camera view angle alpha of the mobile terminal, or when the position movement of the mobile terminal exceeds a preset movement threshold, returning to the step 3.2.
Preferably, in the step 4, after the navigation road information is obtained, whether the mobile terminal moves on the navigation road is judged;
if so, calculating the rotation angle of the guide arrow according to the current road direction and the gyroscope angle of the mobile terminal; if not, calculating the rotation angle of the guide arrow according to the direction angle of the perpendicular segment formed by the current position of the mobile terminal and the nearest road, together with the gyroscope angle of the mobile terminal.
Preferably, said step 5 comprises the following sub-steps:
step 5.1, calculating the distance from each POI in the POI screen coordinate information to the current position of the mobile terminal, and obtaining the distance difference value R between the farthest POI and the nearest POI in the POI screen coordinate information according to the maximum distance and the minimum distance, wherein the following formula is adopted:
R = max{d_i} − min{d_i}
wherein d_i is the distance from the ith POI to the current position of the mobile terminal, and R is the difference between the distances of the farthest and nearest POIs in the POI screen coordinate information from the current position of the mobile terminal;
step 5.2, calculating the y coordinate of each POI in the POI screen coordinate information, and adopting the following formula:
y_i = h − (d_i − min{d_i}) · H/R
wherein h is the height of the lower boundary of the POI display area on the interface, and H is the pixel height of the POI display area;
step 5.3, performing real-time visual rendering on the POI screen coordinate information according to the y coordinate of each POI in the POI screen coordinate information obtained by calculation in the step 5.2 and the updated initial x coordinate of the ith POI obtained by calculation in the step 3.3, and optimizing the label covering problem;
step 5.4, calculating the distance from the current position of the mobile terminal to each road inflection point in the navigation road information, finding the road r1 closest to the current position of the mobile terminal, and calculating the direction angle α3 from the current position of the mobile terminal to the road r1;
Step 5.5, if the direction angle alpha 3 If the value is 0, judging that the mobile terminal moves on the navigation road, and adopting the following formula for guiding the correct road direction by the guide arrow:
α=α 1 -α 2
wherein alpha is the rotation angle of the guide arrow, alpha 1 For correct road direction, α 2 The current direction of the mobile terminal;
step 5.6, if the direction angle α3 is not 0, judging that the mobile terminal deviates from the navigation road, and the guide arrow points to the correct road direction using the following formula:
α = α3 − α2
wherein α is the rotation angle of the guide arrow, and the guide arrow points in the direction of the perpendicular segment formed by the current position of the mobile terminal and the nearest road.
One or more technical solutions provided in the embodiments of the present application have at least the following technical effects or advantages:
according to the indoor navigation visualization method based on augmented reality, indoor POI information located within a first range of the target point is obtained according to the navigation target information; the indoor POI information is screened according to the sensor information and position information of the mobile terminal to obtain first POI information, the first POI information being POI information within the viewing angle range of the mobile terminal; then the basic equipment information of the mobile terminal is acquired, the sensor information of the mobile terminal is acquired in real time, and the POI screen coordinate information is updated in real time in combination with the basic equipment information; the POI screen coordinate information is the screen mapping coordinate information obtained by superimposing the first POI information on the real video stream scene captured by the mobile terminal; then the navigation road information is obtained according to the position information of the mobile terminal and the navigation target information, and the rotation angle of the guide arrow is calculated in real time; finally, the guide arrow and the POI screen coordinate information are rendered in real time. The method acquires the camera video stream of a mobile terminal (such as a mobile phone), acquires POI data, acquires the phone's sensor and position information, screens the POI data, calculates the POI screen mapping coordinates, obtains navigation data from the start and end points, calculates the rotation angle of the guide arrow from the indoor positioning track, and renders the guide arrow and the POI labels in real time.
The real coordinates of the POI labels are converted into phone screen coordinates by a screen coordinate conversion method, realizing real-time rendering of the POI labels; by visualizing the indoor navigation data, the virtual navigation information is superimposed on the real scene video stream, achieving a virtual-real combined indoor navigation effect. The method applies augmented reality to indoor navigation, makes the navigation information clearer and more intuitive, is easier for users to understand, and provides a better navigation experience.
Drawings
Fig. 1 is a schematic frame diagram of an indoor navigation visualization method based on augmented reality according to the present invention;
fig. 2 is a detailed flowchart of an indoor navigation visualization method based on augmented reality according to the present invention.
Detailed Description
In order to better understand the technical scheme, the technical scheme is described in detail in the following with reference to the attached drawings of the specification and specific embodiments.
The main idea of the technical scheme of the invention is as follows: moving beyond the two-dimensional vector visualization of a traditional map, and in order to help the user find indoor routes clearly and unambiguously, indoor navigation is visualized with an augmented-reality-based method. Real video stream data is acquired to present the real indoor scene, and POI data and indoor navigation data are superimposed on it, realizing a virtual-real combined visual effect, expressing indoor navigation information more directly and accurately, enriching the visual effect of real-time navigation, and bringing the user a more humanized and more accurate navigation experience.
The embodiment provides an augmented reality-based indoor navigation visualization method, which is shown in fig. 1 and 2 and comprises the following steps:
step 1, indoor POI information located in a first range of a target point is obtained according to navigation target information.
Specifically, indoor POI data is acquired from a server side.
And 2, screening the indoor POI information according to the sensor information of the mobile terminal and the position information of the mobile terminal to obtain first POI information, wherein the first POI information is POI information within the visual angle range of the mobile terminal.
POI data is then screened according to the mobile terminal sensor data, selecting the POI data within the user's viewing angle range.
Specifically, step 2 includes the following substeps:
step 2.1, storing the current position information of the mobile terminal as a first Object, wherein the first Object comprises an x coordinate value and a y coordinate value; and storing each POI point in the indoor POI information as a second Object, wherein the second Object comprises an x coordinate value, a y coordinate value, a distance to the current coordinate of the mobile terminal and a direction angle to the current position of the mobile terminal of the POI.
Step 2.2, acquiring a gyroscope angle theta of the mobile terminal and a camera angle alpha of the mobile terminal (such as a mobile phone), and if preset conditions are met for each POI, saving data of the POI to a visual array and forming first POI information; the preset conditions are as follows:
that is, if the preset condition is satisfied, the POI is considered to appear in the sight range, and the POI data is stored in the visual array.
Wherein, theta i The direction angle from the ith POI to the current coordinate of the mobile terminal in the indoor POI information, d i The distance from the ith POI to the current position of the mobile terminal,is a distance threshold.
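As an illustration, the screening of step 2.2 can be sketched in Python as follows. This is a minimal sketch, not the patent's implementation: the dictionary keys `theta_i` and `d_i` are hypothetical names, and the wrap-around normalisation of the angular offset is an added assumption.

```python
def screen_pois(pois, theta, alpha, d_threshold):
    """Keep only POIs inside the camera's horizontal field of view and
    within the distance threshold (the preset condition of step 2.2)."""
    visible = []
    for poi in pois:
        # Angular offset of the POI from the current heading, normalised
        # to [-180, 180) so the test works across the 0/360 wrap-around.
        offset = (poi["theta_i"] - theta + 180.0) % 360.0 - 180.0
        if abs(offset) <= alpha / 2.0 and poi["d_i"] <= d_threshold:
            visible.append(poi)
    return visible
```

For example, with a 60° field angle and heading 0°, a POI at direction angle 10° and distance 5 m is kept, while one at 80° or one 500 m away is filtered out.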
Step 3, acquiring basic equipment information of the mobile terminal, acquiring sensor information of the mobile terminal in real time, and updating the POI screen coordinate information in real time by combining the basic equipment information of the mobile terminal; and the POI screen coordinate information is screen mapping coordinate information obtained by superposing the first POI information on a real video stream scene acquired by the mobile terminal.
Screen mapping coordinates are calculated from the longitude and latitude coordinates of each POI in combination with the basic parameters of the mobile terminal hardware and the indoor positioning track coordinates, giving the initial screen coordinates of each POI. The mobile terminal sensor data and the indoor positioning track are then acquired in real time, and the screen coordinates of the POIs are updated in real time.
Specifically, step 3 includes the following substeps:
Step 3.1, obtaining the basic information of the mobile terminal, including the screen pixel width W, the screen pixel height H, and the initial gyroscope angle Ω of the mobile terminal when the system is opened.
Step 3.2, calculating initial screen coordinates of each POI in the first POI information according to the sensor data obtained in real time and by combining the basic information of the equipment, wherein the initial screen coordinates are shown in the following formula:
x_i = W/2 + (θ_i − Ω) · W/α,  y_i = H/2
i.e., the initial y coordinate of the POI is half the screen pixel height,
wherein x_i is the initial x coordinate of the ith POI in the first POI information, and y_i is the initial y coordinate of the ith POI in the first POI information.
Step 3.3, when the gyroscope angle (rotation) of the mobile terminal does not exceed the camera field angle of the mobile terminal, the POI labels slide left and right on screen as the gyroscope angle changes. The screen coordinates of each POI are updated according to the current gyroscope angle of the mobile terminal (that is, the current gyroscope angle is obtained and the post-slide screen coordinates are computed) using the following formula:
x_i′ = W/2 + (θ_i − θ) · W/α
wherein x_i′ is the updated x coordinate of the ith POI, θ is the current gyroscope angle of the mobile terminal, and θ_i is the direction angle from the ith POI in the indoor POI information to the current coordinate of the mobile terminal.
When the gyroscope angle of the mobile terminal exceeds the camera field angle of the mobile terminal, or the position of the mobile terminal moves beyond a certain threshold, the Euclidean distance and direction angle from each POI to the user need to be recalculated, and the process returns to step 3.2.
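The angle-to-pixel mapping of steps 3.2 and 3.3 can be sketched as follows, assuming a linear mapping in which the screen centre corresponds to the current heading and the camera field angle spans the full screen width. The function names are hypothetical; the patent's own formulas are given only as images.

```python
def poi_screen_x(theta_i, heading, width, alpha):
    """Horizontal screen coordinate of a POI label.

    Assumes the screen centre corresponds to the current heading and the
    camera field angle alpha spans the full screen width, giving
    width / alpha pixels per degree.
    """
    offset = (theta_i - heading + 180.0) % 360.0 - 180.0
    return width / 2.0 + offset * width / alpha


def poi_initial_screen_y(height):
    """Initial y coordinate of a POI label: half the screen pixel height,
    as stated for step 3.2."""
    return height / 2.0
```

Under this mapping a POI dead ahead lands at the screen centre, and a POI at the edge of the field angle lands at the screen border; as the gyroscope angle θ changes, recomputing `poi_screen_x` makes the label slide left or right as step 3.3 describes.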
And 4, acquiring navigation road information according to the position information and the navigation target information of the mobile terminal, and calculating the rotation angle of the guiding arrow in real time.
The current navigation data is calculated from the start and end points of the navigation, and the road closest to the user's position is found from the indoor positioning coordinates. It is then judged whether the user is walking on the navigation road: if so, the rotation angle of the guide arrow is calculated from the current road direction and the gyroscope angle of the mobile terminal; if not, it is calculated from the direction angle of the perpendicular segment formed by the current position (i.e., the positioning point) of the mobile terminal and the nearest road, together with the gyroscope angle of the mobile terminal.
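The arrow-rotation decision of step 4 can be sketched as below. The parameter names are hypothetical, and the normalisation to the smaller turning angle is an added assumption beyond the plain differences α1 − α2 and α3 − α2 stated in steps 5.5 and 5.6.

```python
def guide_arrow_rotation(on_road, road_direction, device_direction,
                         perpendicular_direction=None):
    """Rotation angle for the guide arrow (step 4 / steps 5.5 and 5.6).

    on_road -- True if the user is judged to be walking on the navigation road
    road_direction -- direction of the current road segment (alpha_1, degrees)
    device_direction -- current heading of the mobile terminal (alpha_2)
    perpendicular_direction -- direction of the perpendicular segment from
        the current position to the nearest road (alpha_3); used off-road
    """
    # On the road, point along the road; off the road, point back to it.
    target = road_direction if on_road else perpendicular_direction
    # Normalise to [-180, 180) so the arrow turns through the smaller angle.
    return (target - device_direction + 180.0) % 360.0 - 180.0
```

For example, a user heading 30° on a road running at 90° gets a +60° arrow rotation, while a user heading 10° who has drifted off a road lying at bearing 350° gets a −20° rotation back toward it.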
And 5, rendering the guide arrow and the POI screen coordinate information in real time.
Specifically, step 5 includes the following substeps:
step 5.1, calculating the distance from each POI in the POI screen coordinate information to the current position of the mobile terminal, and obtaining the distance difference value R between the farthest POI and the nearest POI in the POI screen coordinate information according to the maximum distance and the minimum distance, wherein the following formula is adopted:
R = max{d_i} − min{d_i}
wherein d_i is the distance from the ith POI to the current position of the mobile terminal, and R is the difference between the distances of the farthest and nearest POIs in the POI screen coordinate information from the current position of the mobile terminal.
Step 5.2, calculating the y coordinate of each POI in the POI screen coordinate information, and adopting the following formula:
y_i = h − (d_i − min{d_i}) · H/R
where h is the height of the lower boundary of the POI display area on the interface, and H is the pixel height of the POI display area.
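Steps 5.1 and 5.2 can be sketched together as follows. Since the patent's formula is given only as an image, the exact placement convention (nearer POIs lower in the display area, farther POIs higher) is an assumption, as is the handling of the degenerate case where all POIs are equally distant.

```python
def poi_label_y(d_i, distances, h, H):
    """y coordinate of a POI label, scaled by distance.

    d_i       -- distance of this POI from the current position
    distances -- distances of all POIs in the POI screen coordinate info
    h         -- height of the lower boundary of the POI display area
    H         -- pixel height of the POI display area
    """
    d_min, d_max = min(distances), max(distances)
    R = d_max - d_min  # distance difference of step 5.1
    if R == 0:
        return h  # all POIs equally far: place on the lower boundary
    # Nearest POI sits on the lower boundary h; farthest sits H pixels above.
    return h - (d_i - d_min) / R * H
```

Spreading the labels vertically by distance in this way also helps with the label-covering problem mentioned in step 5.3, since labels at different depths no longer stack on one line.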
And 5.3, performing real-time visual rendering on the POI screen coordinate information according to the y coordinate of each POI in the POI screen coordinate information obtained by calculation in the step 5.2 and the updated initial x coordinate of the ith POI obtained by calculation in the step 3.3, and optimizing the label covering problem.
Step 5.4, calculating the distance from the current position of the mobile terminal (i.e., the current positioning point coordinate) to each road inflection point (i.e., each navigation trigger point) in the navigation road information, finding the road r1 closest to the current position of the mobile terminal, and calculating the direction angle α3 from the current position of the mobile terminal to the road r1.
Step 5.5, if the direction angle α3 is 0, the mobile terminal is judged to be moving on the navigation road (i.e., the user is walking on the navigation road); the guide arrow then only needs to point in the correct road direction α1. Since the guide arrow has an initial angle and the mobile terminal has a current direction α2, the arrow need only be rotated by an angle α to point in the road heading direction, using the following formula:
α = α1 − α2
wherein α is the rotation angle of the guide arrow, α1 is the correct road direction, and α2 is the current direction of the mobile terminal.
Step 5.6, if the direction angle α3 is not 0, the mobile terminal is judged to have deviated from the navigation road (i.e., the user has left the correct road); the guide arrow must then point back toward the road, in the direction of the perpendicular segment formed by the current position (i.e., the positioning point) of the mobile terminal and the nearest road, using the following formula:
α = α3 − α2
wherein α is the rotation angle of the guide arrow.

In summary, the invention provides an indoor navigation visualization method based on augmented reality which, on the basis of a real video stream scene, superimposes Point of Interest (POI) information and real-time guide information around the positioning point, realizes the fusion of the real scene and virtual navigation information, resolves the confusion that two-dimensional plane visualization causes users, enables users to acquire navigation information more intuitively, accurately, and conveniently, and provides users with a more intuitive and richer indoor navigation system with better human-computer interaction.
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solutions of the present invention and not to limit them. Although the present invention has been described in detail with reference to examples, those skilled in the art should understand that modifications or equivalent substitutions may be made to the technical solutions of the present invention without departing from their spirit and scope, and such modifications are intended to be covered by the claims of the present invention.
Claims (2)
1. An indoor navigation visualization method based on augmented reality is characterized by comprising the following steps:
step 1, acquiring indoor POI information positioned in a first range of a target point according to navigation target information;
step 2, screening the indoor POI information according to sensor information of a mobile terminal and position information of the mobile terminal to obtain first POI information, wherein the first POI information is POI information within a visual angle range of the mobile terminal;
step 3, acquiring basic equipment information of the mobile terminal, acquiring sensor information of the mobile terminal in real time, and updating the POI screen coordinate information in real time by combining the basic equipment information of the mobile terminal; the POI screen coordinate information is screen mapping coordinate information obtained by overlaying the first POI information to a real video stream scene acquired by the mobile terminal;
step 4, obtaining navigation road information according to the position information of the mobile terminal and the navigation target information, and calculating a rotation angle of a guide arrow in real time;
step 5, rendering the guide arrow and the POI screen coordinate information in real time;
the step 2 comprises the following substeps:
step 2.1, storing the current position information of the mobile terminal as a first Object, wherein the first Object comprises an x coordinate value and a y coordinate value; respectively storing each POI in the indoor POI information as a second Object, wherein the second Object comprises an x coordinate value, a y coordinate value, a distance to a current coordinate of the mobile terminal and a direction angle to a current position of the mobile terminal of the POI;
step 2.2, acquiring a gyroscope angle theta of the mobile terminal and a camera field angle alpha of the mobile terminal, and if preset conditions are met for each POI, saving data of the POI to a visual array and forming first POI information; the preset conditions are as follows:
θ − α/2 ≤ θ_i ≤ θ + α/2, and d_i ≤ D
wherein θ_i is the direction angle from the ith POI in the indoor POI information to the current coordinate of the mobile terminal, d_i is the distance from the ith POI to the current position of the mobile terminal, and D is a distance threshold;
the step 3 comprises the following substeps:
step 3.1, obtaining the basic equipment information of the mobile terminal, wherein the basic equipment information comprises a screen pixel width W, a screen pixel height H, and an initial gyroscope angle Ω of the mobile terminal;
step 3.2, calculating initial screen coordinates of each POI in the first POI information according to the sensor information obtained in real time and by combining the basic information of the equipment, and adopting the following formula:
x_i = W/2 + (θ_i − Ω) · W/α,  y_i = H/2
wherein x_i is the initial x coordinate of the ith POI in the first POI information, and y_i is the initial y coordinate of the ith POI in the first POI information;
and 3.3, when the angle of the gyroscope of the mobile terminal does not exceed the field angle alpha of the camera of the mobile terminal, updating the screen coordinate of the POI according to the current angle of the gyroscope of the mobile terminal, wherein the updating adopts the following formula:
x_i′ = W/2 + (θ_i − θ) · W/α
wherein x_i′ is the updated x coordinate of the ith POI, θ is the current gyroscope angle of the mobile terminal, and θ_i is the direction angle from the ith POI in the indoor POI information to the current coordinate of the mobile terminal;
when the angle of the gyroscope of the mobile terminal exceeds the camera field angle alpha of the mobile terminal, or when the position movement of the mobile terminal exceeds a preset movement threshold, returning to the step 3.2;
the step 5 comprises the following substeps:
step 5.1, calculating the distance from each POI in the POI screen coordinate information to the current position of the mobile terminal, and obtaining the distance difference value R between the farthest POI and the nearest POI in the POI screen coordinate information according to the maximum distance and the minimum distance, wherein the following formula is adopted:
R = max{d_i} − min{d_i}
wherein d_i is the distance from the ith POI to the current position of the mobile terminal, and R is the difference between the distances of the farthest and nearest POIs in the POI screen coordinate information from the current position of the mobile terminal;
step 5.2, calculating the y coordinate of each POI in the POI screen coordinate information, and adopting the following formula:
y_i = h − (d_i − min{d_i}) · H/R
wherein h is the height of the lower boundary of the POI display area on the interface, and H is the pixel height of the POI display area;
step 5.3, performing real-time visual rendering on the POI screen coordinate information according to the y coordinate of each POI in the POI screen coordinate information obtained by calculation in the step 5.2 and the updated initial x coordinate of the ith POI obtained by calculation in the step 3.3, and optimizing the label covering problem;
step 5.4, calculating the distance from the current position of the mobile terminal to each road inflection point in the navigation road information, finding the road r1 closest to the current position of the mobile terminal, and calculating the direction angle α3 from the current position of the mobile terminal to the road r1;
step 5.5, if the direction angle α3 is 0, judging that the mobile terminal moves on the navigation road, and the guide arrow points to the correct road direction using the following formula:
α = α1 − α2
wherein α is the rotation angle of the guide arrow, α1 is the correct road direction, and α2 is the current direction of the mobile terminal;
step 5.6, if the direction angle α3 is not 0, judging that the mobile terminal deviates from the navigation road, and the guide arrow points to the correct road direction using the following formula:
α = α3 − α2
wherein α is the rotation angle of the guide arrow, and the guide arrow points in the direction of the perpendicular segment formed by the current position of the mobile terminal and the nearest road.
2. The augmented-reality-based indoor navigation visualization method of claim 1, wherein in step 4, after the navigation road information is obtained, it is determined whether the mobile terminal is moving on the navigation road;
if so, the rotation angle of the guide arrow is calculated from the current road direction and the gyroscope angle of the mobile terminal; if not, the rotation angle of the guide arrow is calculated from the direction angle of the perpendicular line segment from the current position of the mobile terminal to the nearest road and the gyroscope angle of the mobile terminal.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010913376.7A CN112146656B (en) | 2020-09-03 | 2020-09-03 | Indoor navigation visualization method based on augmented reality |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112146656A CN112146656A (en) | 2020-12-29 |
CN112146656B true CN112146656B (en) | 2023-02-17 |
Family
ID=73889244
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010913376.7A Active CN112146656B (en) | 2020-09-03 | 2020-09-03 | Indoor navigation visualization method based on augmented reality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112146656B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11785430B2 (en) | 2021-04-13 | 2023-10-10 | Research Foundation Of The City University Of New York | System and method for real-time indoor navigation |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104101354A (en) * | 2013-04-15 | 2014-10-15 | 北京四维图新科技股份有限公司 | Method, apparatus and system for optimizing POI guiding coordinates in map data |
CN105095314A (en) * | 2014-05-22 | 2015-11-25 | 北京四维图新科技股份有限公司 | Point of interest (POI) marking method, terminal, navigation server and navigation system |
CN105371847A (en) * | 2015-10-27 | 2016-03-02 | 深圳大学 | Indoor live-action navigation method and system |
CN107240156A (en) * | 2017-06-07 | 2017-10-10 | 武汉大学 | High-precision outdoor augmented reality spatial information display system and method
CN109974733A (en) * | 2019-04-02 | 2019-07-05 | 百度在线网络技术(北京)有限公司 | POI display methods, device, terminal and medium for AR navigation |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110153198A1 (en) * | 2009-12-21 | 2011-06-23 | Navisus LLC | Method for the display of navigation instructions using an augmented-reality concept |
US8265866B2 (en) * | 2010-12-15 | 2012-09-11 | The Boeing Company | Methods and systems for augmented navigation |
KR20150126289A (en) * | 2014-05-02 | 2015-11-11 | 한국전자통신연구원 | Navigation apparatus for providing social network service based on augmented reality, metadata processor and metadata processing method in the augmented reality navigation system |
- 2020-09-03: CN202010913376.7A filed; granted as patent CN112146656B (status: Active)
Non-Patent Citations (3)
Title |
---|
Hou Xiaoning et al., "Research on Application Modes of Augmented Reality Electronic Maps," Journal of Geomatics Science and Technology, 2016, Vol. 33, No. 6, pp. 639-643. * |
Ying Shen et al., "Implementation of an Android-Based Indoor Augmented Reality System," Geomatics World, 2016, Vol. 23, No. 1, pp. 93-98. * |
Cheng Xiong, "Research and Application of Augmented Reality Technology in an Indoor Navigation System on the iPhone Platform," China Master's Theses Full-text Database (Information Science and Technology), 2013, No. S2. * |
Also Published As
Publication number | Publication date |
---|---|
CN112146656A (en) | 2020-12-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10760922B2 (en) | Augmented reality maps | |
US20190180512A1 (en) | Method for Representing Points of Interest in a View of a Real Environment on a Mobile Device and Mobile Device Therefor | |
CN108474666B (en) | System and method for locating a user in a map display | |
EP2643822B1 (en) | Guided navigation through geo-located panoramas | |
US9996982B2 (en) | Information processing device, authoring method, and program | |
US20130162665A1 (en) | Image view in mapping | |
EP3596588B1 (en) | Gradual transitioning between two-dimensional and three-dimensional augmented reality images | |
US12092479B2 (en) | Map feature identification using motion data and surfel data | |
CN113570664B (en) | Augmented reality navigation display method and device, electronic equipment and computer medium | |
US11290705B2 (en) | Rendering augmented reality with occlusion | |
KR20190086032A (en) | Contextual map view | |
Narzt et al. | A new visualization concept for navigation systems | |
CN112146656B (en) | Indoor navigation visualization method based on augmented reality | |
US20230134475A1 (en) | Viewport system for dynamically framing of a map based on updating data | |
US11334232B1 (en) | Systems and methods for interactive maps | |
US10930079B1 (en) | Techniques for displaying augmentations that represent cadastral lines and other near-ground features | |
CN112558008A (en) | Navigation method, system, equipment and medium based on optical communication device | |
CN113452842A (en) | Flight AR display method, system, computer equipment and storage medium | |
Stroila et al. | Route visualization in indoor panoramic imagery with open area maps | |
US20230384871A1 (en) | Activating a Handheld Device with Universal Pointing and Interacting Device | |
KR101924491B1 (en) | System and Method for Clipping Augmented Reality Object based on GIS data | |
Gandhi et al. | A* Algorithm and Unity for Augmented Reality-based Indoor Navigation. | |
Mower | The augmented scene: integrating the map and the environment | |
Khare et al. | Amalgam Version of Itinerant Augmented Reality | |
Patel et al. | Improving Navigation for Street Data Using Mobile Augmented Reality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||