CN113484889A - Immersive navigation system based on augmented reality and satellite positioning of mobile terminal - Google Patents


Info

Publication number
CN113484889A
CN113484889A (application CN202110768239.3A)
Authority
CN
China
Prior art keywords
module
augmented reality
navigation system
positioning
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110768239.3A
Other languages
Chinese (zh)
Inventor
士涛
刘琦
王茂松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National University of Defense Technology
Priority: CN202110768239.3A
Publication: CN113484889A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00 — Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38 — Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39 — Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/42 — Determining position
    • G01S 19/45 — Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S 19/46 — Determining position, the supplementary measurement being of a radio-wave signal type
    • G01S 19/47 — Determining position, the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/26 — Navigation specially adapted for navigation in a road network
    • G01C 21/28 — Navigation in a road network with correlation of data from several navigational instruments

Abstract

The invention relates to the technical field of navigation, in particular to an immersive navigation system based on augmented reality and satellite positioning of a mobile terminal. The system comprises an outdoor augmented reality navigation system and an indoor augmented reality navigation system. In an outdoor environment, the system converts satellite positioning coordinates and point-of-interest coordinates into virtual coordinates, and then uses augmented reality technology to place virtual road signs and virtual routes on the real road, achieving the effect of virtual-real fusion. In an indoor environment, the system integrates indoor map information acquisition, indoor navigation positioning and augmented reality navigation: a WiFi fingerprint matching positioning algorithm provides area-level positioning, and a visual matching positioning method refines this to high-precision indoor positioning.

Description

Immersive navigation system based on augmented reality and satellite positioning of mobile terminal
Technical Field
The invention relates to the technical field of navigation, in particular to an immersive navigation system based on augmented reality and satellite positioning of a mobile terminal.
Background
With the rapid growth of the internet and smartphones, location-based services are becoming increasingly popular, and the demand for navigation keeps rising. In recent years, with the rapid development of AR technology, the combination of augmented reality and location-based services (LBS) has become a research hotspot. Augmented reality gives users an intuitive navigation mode for travel, greatly improving navigation efficiency. It solves the problem that users find it difficult to locate a target building during navigation, effectively overcoming the shortcomings of traditional navigation.
Disclosure of Invention
Aiming at the defects in the prior art, the invention provides an immersive navigation system based on augmented reality and satellite positioning of a mobile terminal. In an outdoor environment, the system converts satellite positioning coordinates and point-of-interest coordinates into virtual coordinates, and then uses augmented reality technology to place virtual road signs and virtual routes on the real road, achieving the effect of virtual-real fusion. In an indoor environment, it realizes area-level positioning with a WiFi fingerprint matching positioning algorithm and achieves high-precision indoor positioning by combining a visual matching positioning method.
In order to achieve the purpose, the invention provides the following technical scheme:
immersive navigation system based on augmented reality and satellite positioning of a mobile terminal, comprising:
the outdoor augmented reality navigation system, which first acquires user positioning information, fuses the attitude data and height data of the user equipment, and then superimposes virtual navigation information onto the real scene;

the indoor augmented reality navigation system, which establishes a visual point cloud map and a WiFi fingerprint identification library, calculates the position and attitude of the user equipment, fuses virtual navigation information with the video stream images in real time, and displays the virtual-real fused navigation information.
The invention is further configured to: the outdoor augmented reality navigation system comprises a satellite positioning module and a real scene rendering module, wherein,
the satellite positioning module is used for acquiring the position information of the current user.
The real scene rendering module is used for displaying the virtual navigation path, drawing the coordinate data of the interest point in real time and tracking the posture of the mobile terminal, and is electrically connected with the satellite positioning module.
The invention is further configured to: the outdoor augmented reality navigation system further comprises a Gaode (Amap) map data module and a sensor module, wherein,

the Gaode map data module is used for map searching and path planning, and is electrically connected with the live-action rendering module.

The sensor module acquires the attitude and azimuth information of the mobile phone according to the state of the mobile phone, and is electrically connected with the Gaode map data module.
The invention is further configured to: the sensor module comprises a mobile phone inertial sensor, an acceleration sensor and a magnetometer.
The invention is further configured to: the indoor augmented reality navigation system comprises a visual point cloud map acquisition module and a WLAN fingerprint information acquisition and positioning module, wherein,
the visual point cloud map acquisition module is used for acquiring indoor environment information in real time and establishing an indoor visual map.
The WLAN fingerprint information acquisition and positioning module establishes a fingerprint identification database by acquiring fingerprint information, and the WLAN fingerprint information acquisition and positioning module is electrically connected with the visual point cloud map acquisition module.
The invention is further configured to: the indoor augmented reality navigation system also comprises a characteristic point detection module and a virtual-real rendering module, wherein,
the characteristic point detection module is used for selecting visual difference points in the image, tracking and updating the visual difference points in real time, and obtaining stable characteristic points through an algorithm, and is electrically connected with the WLAN fingerprint information acquisition and positioning module.
The virtual and real rendering module is used for fusing the virtual navigation information and the video stream images in real time and displaying the virtual navigation information and the video stream images on a mobile phone screen, and the virtual and real rendering module is electrically connected with the characteristic point detection module.
The invention is further configured to: the visual difference points refer to points with large differences in brightness, color and gray level in the image.
The invention is further configured to: the algorithm is a Wi-Fi positioning algorithm, preferably a location-fingerprinting algorithm based on received signal strength.
Advantageous effects
Compared with the known public technology, the technical scheme provided by the invention has the following beneficial effects:
(1) the system converts the satellite positioning coordinates and the interest point coordinates into virtual coordinates in an outdoor environment, and finally places the virtual road signs and the virtual routes on a real road by using an augmented reality technology to achieve the effect of virtual-real fusion.
(2) The system integrates the functions of indoor map information acquisition, indoor navigation positioning and augmented reality navigation in an indoor environment, realizes area-level positioning by utilizing a WiFi fingerprint matching positioning algorithm, and realizes high-precision indoor positioning by combining a visual matching positioning method.
Drawings
FIG. 1 is a diagram of indoor and outdoor systems for an immersive navigation system based on augmented reality and satellite positioning of a mobile terminal;
fig. 2 is a schematic diagram of the satellite positioning principle in an immersive navigation system based on augmented reality and satellite positioning of a mobile terminal;
FIG. 3 is a diagram of the correspondence between the coordinates of the sensor and the coordinates of the screen of the mobile phone in the immersive navigation system based on the augmented reality and satellite positioning of the mobile terminal;
FIG. 4 is a schematic diagram of a spatial geodetic coordinate system in an immersive navigation system based on augmented reality and satellite positioning of a mobile terminal;
FIG. 5 is a schematic diagram of a conversion from a geodetic coordinate system to a camera coordinate system in an immersive navigation system based on augmented reality and satellite positioning of a mobile terminal;
FIG. 6 is a schematic diagram of the conversion of three-dimensional cameras to two-dimensional screen coordinates in an immersive navigation system based on augmented reality and satellite positioning of a mobile terminal;
FIG. 7 is a schematic diagram of a pixel coordinate system in an immersive navigation system based on augmented reality and satellite positioning of a mobile terminal;
FIG. 8 is a schematic diagram of a coordinate system transformation process in an immersive navigation system based on augmented reality and satellite positioning of a mobile terminal;
fig. 9 is a schematic diagram of a positioning algorithm in an immersive navigation system based on augmented reality and satellite positioning of a mobile terminal.
The reference numbers in the figures illustrate:
1. an outdoor augmented reality navigation system; 101. a satellite positioning module; 102. a live-action rendering module; 103. a Gaode (Amap) map data module; 104. a sensor module; 2. an indoor augmented reality navigation system; 201. a visual point cloud map acquisition module; 202. a WLAN fingerprint information acquisition and positioning module; 203. a feature point detection module; 204. a virtual-real rendering module.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention; it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments, and all other embodiments obtained by those skilled in the art without any inventive work are within the scope of the present invention.
In the description of the present invention, it should be noted that the terms "upper", "lower", "inner", "outer", "top/bottom", and the like indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience of description and simplification of description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and thus should not be construed as limiting the present invention. Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "disposed," "sleeved/connected," "connected," and the like are to be construed broadly, e.g., "connected," which may be fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; the two components can be directly connected or indirectly connected through an intermediate medium, and the two components can be communicated with each other; the specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
Example 1:
referring to fig. 1-9, an immersive navigation system based on augmented reality and satellite positioning of a mobile terminal includes:
The outdoor augmented reality navigation system 1 first obtains user positioning information, fuses the attitude data and height data of the user equipment, and then superimposes virtual navigation information onto the real scene.
The indoor augmented reality navigation system 2 calculates the position and attitude of the user equipment by establishing a visual point cloud map and a WiFi fingerprint identification library, fuses virtual navigation information with the video stream images in real time, and displays the virtual-real fused navigation information.
The outdoor augmented reality navigation system 1 comprises a satellite positioning module 101 and a real scene rendering module 102, wherein,
the satellite positioning module 101 is used for acquiring the current user position information.
The real-scene rendering module 102 is configured to display a virtual navigation path, draw coordinate data of a point of interest in real time, and track a posture of the mobile terminal, and the real-scene rendering module 102 is electrically connected to the satellite positioning module 101.
The outdoor augmented reality navigation system 1 further comprises a Gaode (Amap) map data module 103 and a sensor module 104, wherein,

the Gaode map data module 103 is used for map search and path planning, and the Gaode map data module 103 is electrically connected with the live-action rendering module 102.

The sensor module 104 acquires the attitude and azimuth information of the mobile phone according to the state of the mobile phone, and the sensor module 104 is electrically connected with the Gaode map data module 103.
The sensor module 104 includes a handset inertial sensor, an acceleration sensor, and a magnetometer.
In this embodiment, a receiver is used as the satellite positioning module 101. During positioning, the satellites continuously send signals to the receiver, which positions itself by receiving satellite signals in real time. As shown in fig. 2, the 24 GPS satellites are uniformly distributed in a network; ideally, 3 satellites suffice to determine the user's position, the receiver solving its position coordinates by establishing one equation per satellite. Because of the clock error, a fourth satellite is introduced as a correction satellite, giving four equations from which the position can be solved. The system of equations used is as follows:
d_i = √((x − x_i)² + (y − y_i)² + (z − z_i)²) + c·(v_t0 − v_ti),  i = 1, 2, 3, 4
In the formula, x_i, y_i, z_i (i = 1, 2, 3, 4) are the spatial coordinates of satellite i at time t; v_ti (i = 1, 2, 3, 4) is the clock error of satellite i, provided by the satellite ephemeris; d_i (i = 1, 2, 3, 4) is the measured distance (pseudorange) from the observation point to satellite i; c is the propagation speed of the GPS radio signal (the speed of light); x, y, z are the actual position coordinates to be solved; and v_t0 is the clock error of the receiver.
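The four pseudorange equations above can be solved numerically. The following is a minimal sketch (not the patent's implementation) that applies Newton iteration to recover the receiver position and the clock-bias term; the satellite coordinates and bias value used below are illustrative only.

```python
import math

def solve_gps(sats, pseudoranges, iters=10):
    """Solve d_i = |(x, y, z) - sat_i| + b for the receiver position
    (x, y, z) and the common clock-bias term b = c*v_t0, with each
    satellite's own clock term assumed already applied to d_i.
    Uses Newton iteration on the four-equation system."""
    x = y = z = b = 0.0                    # initial guess at Earth's centre
    for _ in range(iters):
        rows, resid = [], []
        for (sx, sy, sz), rho in zip(sats, pseudoranges):
            d = math.sqrt((x - sx) ** 2 + (y - sy) ** 2 + (z - sz) ** 2)
            resid.append(rho - (d + b))    # measured minus predicted
            rows.append([(x - sx) / d, (y - sy) / d, (z - sz) / d, 1.0])
        # Solve the 4x4 linear system rows * delta = resid (Gaussian elimination)
        m = [r[:] + [v] for r, v in zip(rows, resid)]
        for col in range(4):
            piv = max(range(col, 4), key=lambda k: abs(m[k][col]))
            m[col], m[piv] = m[piv], m[col]
            for k in range(col + 1, 4):
                f = m[k][col] / m[col][col]
                for j in range(col, 5):
                    m[k][j] -= f * m[col][j]
        delta = [0.0] * 4
        for i in reversed(range(4)):
            s = m[i][4] - sum(m[i][j] * delta[j] for j in range(i + 1, 4))
            delta[i] = s / m[i][i]
        x += delta[0]; y += delta[1]; z += delta[2]; b += delta[3]
    return (x, y, z), b
```

With four satellites in a non-degenerate geometry, the iteration converges quickly from the origin guess; a fifth or further satellite would turn this into a least-squares problem.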
The mobile phone inertial sensor measures the azimuth, pitch and tilt angles of the phone. The sensor coordinate system is fixed to the device: no matter how the device's state changes, whether it is held in portrait or landscape mode, the origin of the sensor coordinate system keeps the same position relative to the phone. The y axis points from the bottom of the phone toward its top, the x axis always points from left to right across the screen, and the z axis is perpendicular to the screen, regardless of the phone's actual position and orientation.

The directions of the x and y axes are illustrated in fig. 3. In summary, when the phone is held in portrait orientation the sensor coordinate system and the screen coordinate system coincide, but when it is held in landscape orientation the two coordinate systems no longer align.
The data of the acceleration sensor and the magnetometer are represented as vectors. If the device is tilted with its left side facing down, the sensed value on the device's X axis approaches the value of gravitational acceleration; if the screen faces vertically upward, the sensed value on the Z axis approaches gravitational acceleration; and if the device is held upright, there is no acceleration on the X and Z axes and the full gravitational acceleration is borne by the Y axis. The magnetometer value represents the magnetic field strength of the surrounding environment along the three coordinate axes; ignoring environmental influence, the measured field direction is taken as the north-south direction.
In this embodiment, the position information of the user and the target is obtained from the GNSS module. The current coordinates of the user are [Lat_user, Lon_user, Alt_user]^T and the longitude and latitude coordinates of the target are [Lat_poi, Lon_poi, Alt_poi]^T. The distance between the two coordinates is calculated as follows:
Δlon = lon_poi − lon_user

Δlat = lat_poi − lat_user

a = sin²(Δlat/2) + cos(lat_user)·cos(lat_poi)·sin²(Δlon/2)

c = 2·atan2(√a, √(1 − a))

d = earthRadius · c
where earthRadius represents the radius of the earth and d represents the distance between the current position and the target position.
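The Δlat/Δlon formulas above are the standard haversine computation; a minimal sketch (function and parameter names are illustrative):

```python
import math

def haversine(lat_user, lon_user, lat_poi, lon_poi, earth_radius=6371000.0):
    """Distance d (in metres) between the user and a point of interest,
    following the delta-lat/delta-lon formulas in the text. Inputs in degrees."""
    lat_u, lon_u, lat_p, lon_p = map(math.radians,
                                     (lat_user, lon_user, lat_poi, lon_poi))
    dlat = lat_p - lat_u
    dlon = lon_p - lon_u
    a = math.sin(dlat / 2) ** 2 + \
        math.cos(lat_u) * math.cos(lat_p) * math.sin(dlon / 2) ** 2
    c = 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))
    return earth_radius * c
```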
Both augmented reality navigation and traditional map navigation display their results on the navigation interface. To present the data of a point of interest during augmented reality navigation, the geographic coordinates given as latitude and longitude are first converted into geodetic coordinates. The conversion process is as follows:
X = (N + alt)·cos(lat)·cos(lon)

Y = (N + alt)·cos(lat)·sin(lon)

Z = (N·(1 − e²) + alt)·sin(lat)

wherein

N = a / √(1 − e²·sin²(lat))

e² = (a² − b²) / a²

In the equations, X, Y, Z are the coordinate values in the geodetic coordinate system, a is the semi-major axis (a = 6378.137 km) and b is the semi-minor axis (b = 6356.752314 km). The coordinate system obtained by the conversion is the spatial geodetic coordinate system.
As shown in fig. 4, the spatial geodetic coordinate system is characterized in that the origin is located at the center of an elliptic sphere corresponding to the whole earth, the Z axis points to the north pole of the earth axis, the X axis points to the intersection point of the meridian plane and the equator, and the Y axis and the X axis form an included angle of 90 degrees.
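The latitude/longitude-to-spatial-geodetic conversion above can be sketched as follows, using the a and b values given in the text:

```python
import math

A_AXIS = 6378137.0           # semi-major axis a, metres (6378.137 km)
B_AXIS = 6356752.314         # semi-minor axis b, metres (6356.752314 km)
E2 = (A_AXIS ** 2 - B_AXIS ** 2) / A_AXIS ** 2   # e^2 = (a^2 - b^2) / a^2

def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert latitude/longitude/altitude to spatial geodetic (X, Y, Z)
    coordinates using the X, Y, Z and N formulas in the text."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    n = A_AXIS / math.sqrt(1 - E2 * math.sin(lat) ** 2)
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + alt_m) * math.sin(lat)
    return x, y, z
```

At the equator this returns X = a on the X axis; at the pole, Z equals the semi-minor axis b, matching the description of fig. 4.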
When the geodetic coordinate system is converted into the carrier coordinate system, it must be rotated about the X, Y and Z axes by certain angles. Rotating about the Y axis by an angle φ, the transformation is:

[x′, y′, z′]^T = R_φ · [x, y, z]^T,  where

R_φ =
[ cos φ   0   −sin φ ]
[   0     1      0   ]
[ sin φ   0    cos φ ]

and [x′, y′, z′]^T is the coordinate value after the rotation. The transformation after rotating by an angle w about the X axis is:

R_w =
[ 1      0       0   ]
[ 0    cos w   sin w ]
[ 0   −sin w   cos w ]

where R_w is the rotation transformation matrix about the X axis. Finally, rotating about the Z axis by an angle θ:

R_θ =
[  cos θ   sin θ   0 ]
[ −sin θ   cos θ   0 ]
[    0       0     1 ]

Therefore, there is:

P′ = R_θ · R_w · R_φ · P

and the rotation matrix R is:

R = R_θ · R_w · R_φ
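The three elementary rotations and their composition R = R_θ·R_w·R_φ can be sketched as follows; the matrix entries follow the standard forms assumed above (sign conventions vary between texts):

```python
import math

def rot_y(phi):      # R_phi: rotation about the Y axis
    c, s = math.cos(phi), math.sin(phi)
    return [[c, 0.0, -s], [0.0, 1.0, 0.0], [s, 0.0, c]]

def rot_x(w):        # R_w: rotation about the X axis
    c, s = math.cos(w), math.sin(w)
    return [[1.0, 0.0, 0.0], [0.0, c, s], [0.0, -s, c]]

def rot_z(theta):    # R_theta: rotation about the Z axis
    c, s = math.cos(theta), math.sin(theta)
    return [[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]]

def matmul(a, b):
    """3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rotation_matrix(phi, w, theta):
    """R = R_theta * R_w * R_phi: rotate about Y, then X, then Z."""
    return matmul(rot_z(theta), matmul(rot_x(w), rot_y(phi)))
```

Any such composition is orthonormal (R·Rᵀ = I), which is a convenient sanity check on the matrix entries.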
r is a rotation matrix for converting the geodetic coordinate system to the user equipment camera coordinate system, the conversion process being illustrated in FIG. 5, PwAnd PcRepresent the same point of different significance, wherein PwAnd PcThe points are represented by the geodetic and camera coordinate systems, respectively, and the three axes of the world geodetic coordinate system are represented by U, V, W, with the three axes of the camera coordinate system generally being chosen to be represented by X, Y, Z.
The geodetic coordinate is adopted to represent the target position, meanwhile, the camera coordinate system is adopted to represent the coordinate of the mobile equipment, the geodetic coordinate system is connected with the camera coordinate system, after the geodetic coordinate system is converted by a rotation matrix R, three axes of the two coordinate systems are aligned, but the coordinate needs to be translated, the camera coordinate where the equipment is located is obtained, the corresponding geodetic coordinate can be obtained by a conversion mode, and a formula corresponding to a matrix of the movement and rotation operations is as follows:
P_c = R·P_w + t
The translation vector t makes the coordinate origins of the two systems coincide, so the correspondence between the two coordinate systems is described by the rotation matrix R and the translation vector t, where R is a 3×3 matrix and t is a three-dimensional translation vector. In order to display coordinate points on a two-dimensional screen, a conversion from three dimensions to two dimensions is also needed.
As shown in fig. 6, O-XYZ is the camera coordinate system and o-xy is the plane in which the image lies. After perspective projection, a scene point P maps to the point p on the image plane, and the distance from O to the projection plane is f, i.e., the focal length. Two similar triangles can be seen in fig. 6, and by the similar-triangle theorem:
x / x′ = f / z′

y / y′ = f / z′
in fig. 6, a point P on a two-dimensional plane is represented by a vector P ═ x, y, and P 'is a point in three-dimensional coordinates, and is represented by P ═ x', y ', z', according to the homogeneous coordinate principle, there are:
Figure BDA0003152734930000113
the above formula can be expressed in matrix form as:
Figure BDA0003152734930000123
the above formula implements a coordinate transformation from three dimensions to two dimensions. The unit of the projected point p 'is mm, so it is necessary to convert the point p' from the plane coordinate system to the pixel coordinate system and to convert it to a point in the digital image.
The origin position of the pixel coordinate system is Ouv, the midpoint of the imaging plane is O, and the conversion formula of the two coordinates is as follows:
u = x / dx + u_0

v = y / dy + v_0
where dx and dy are the physical width and height, in millimetres, of one pixel column and one pixel row, and (u_0, v_0) is the pixel position of the imaging-plane midpoint. After this series of transformation steps, a point in the geodetic coordinate system has been successfully transformed into a pixel coordinate on the mobile phone screen. The transformation process is shown in fig. 8: the point of interest is converted first into the world coordinate system, then into the camera coordinate system, then into the screen coordinate system, and finally into pixel coordinates.
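The camera-to-pixel step can be sketched as below; the focal length, pixel pitch and principal-point values used in testing are illustrative:

```python
def camera_to_pixel(p_cam, f, dx, dy, u0, v0):
    """Project a camera-frame point p' = (x', y', z') onto the image plane
    (x = f*x'/z', y = f*y'/z', in mm) and convert to pixel coordinates
    (u = x/dx + u0, v = y/dy + v0), following the formulas above."""
    xc, yc, zc = p_cam
    x = f * xc / zc          # similar-triangle perspective projection
    y = f * yc / zc
    u = x / dx + u0          # millimetres -> pixel columns
    v = y / dy + v0          # millimetres -> pixel rows
    return u, v
```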
Example 2:
on the basis of the embodiment 1, the indoor augmented reality navigation system 2 includes a visual point cloud map collecting module 201 and a WLAN fingerprint information collecting and positioning module 202, wherein,
the visual point cloud map acquisition module 201 is used for acquiring indoor environment information in real time and establishing an indoor visual map.
The WLAN fingerprint information collecting and positioning module 202 establishes a fingerprint identification database by collecting fingerprint information, and the WLAN fingerprint information collecting and positioning module 202 is electrically connected with the visual point cloud map collecting module 201.
The indoor augmented reality navigation system 2 further comprises a feature point detection module 203 and a virtual-real rendering module 204, wherein,
the feature point detection module 203 is configured to select visual difference points in the image, track and update the visual difference points in real time, and obtain stable feature points through an algorithm, and the feature point detection module 203 is electrically connected to the WLAN fingerprint information acquisition and positioning module 202.
The virtual-real rendering module 204 is configured to fuse the virtual navigation information and the video stream image in real time, and display the virtual navigation information and the video stream image on a screen of the mobile phone, and the virtual-real rendering module 204 is electrically connected to the feature point detecting module 203.
The visual difference points refer to points with large differences in brightness, color and gray level in the image.
The algorithm is a Wi-Fi positioning algorithm; since Wi-Fi positioning algorithms are numerous, a location-fingerprinting algorithm based on signal strength is preferred here.
In this embodiment, in order to realize accurate Wi-Fi positioning in an indoor environment, wireless hotspots (APs) are placed indoors. Since each wireless AP has a unique MAC address, the Wi-Fi positioning algorithm adopted by the indoor augmented reality navigation system 2 is a location-fingerprinting algorithm based on received signal strength indication (RSSI).
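A minimal sketch of the fingerprint-matching step used for area-level positioning follows; the database layout, the k-nearest-neighbour rule and the −100 dBm floor for unheard APs are illustrative assumptions, not the patent's exact method:

```python
def fingerprint_locate(fingerprint_db, observed, k=3):
    """Area-level positioning by WiFi fingerprint matching: compare the
    observed AP->RSSI reading against each surveyed fingerprint using a
    Euclidean distance in signal space, then average the positions of the
    k nearest fingerprints."""
    floor = -100.0     # assumed RSSI for an AP missing from a reading
    def signal_distance(sample):
        aps = set(sample) | set(observed)
        return sum((sample.get(ap, floor) - observed.get(ap, floor)) ** 2
                   for ap in aps) ** 0.5
    nearest = sorted(fingerprint_db,
                     key=lambda rec: signal_distance(rec["rssi"]))[:k]
    xs = [rec["pos"][0] for rec in nearest]
    ys = [rec["pos"][1] for rec in nearest]
    return sum(xs) / len(xs), sum(ys) / len(ys)
```

The AP MAC address would serve as the dictionary key in practice, which is why the uniqueness of MAC addresses matters for this scheme.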
The principle of received signal strength (RSSI) positioning is: the received signal strength is expressed as a power indicator of the received signal, and signal strength decreases as the transmission distance increases. The relationship between signal strength and transmission distance is as follows:
P = P_0 − 10·n_p·log10(d / d_0)
the RSSI is a received signal strength value, is represented by P, has a unit of dBm, and represents the strength of a signal received by an instrument between a node sending the signal and a node receiving the signal when the signal is sent out; in addition, the distance between the signal transmitting and receiving nodes is denoted by d. P0The unit of (b) is also dBm, which means that the RSSI value of the transmitted signal is a relative value, and is defined as 1m, in general, compared with the RSSI value of the signal received by the receiving node and compared with the RSSI value of the signal received by the fixed-distance node. Defining a path loss exponent of npIt is used to characterize the degree of attenuation of a signal during propagation. The degree of attenuation is closely related to the surrounding environment. In Table 1, npTypical values of (a).
Table 1 path loss exponent reference table
[Table 1 appears in the source as an image: typical values of the path loss exponent np for different propagation environments.]
Assuming that A is the signal strength after the signal has propagated 1 m, the wireless signal transmission model can be rewritten as:

P = A - 10·np·log10(d)

As the above formula shows, the RSSI value P depends on the signal propagation distance d, on A, and on the path loss exponent np. If A and np are known and P can be measured, d can be obtained by calculation.
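Inverting this formula for d can be sketched as follows (a hypothetical illustration: the reference strength A = -45 dBm and exponent np = 2.5 are example values that would in practice be calibrated for the actual environment, cf. Table 1):

```python
def rssi_to_distance(p_dbm, a_dbm=-45.0, n_p=2.5):
    """Invert P = A - 10*n_p*log10(d) to estimate the distance d in metres.
    a_dbm is the RSSI at the 1 m reference distance; n_p is the path
    loss exponent (both illustrative, environment-dependent values)."""
    return 10 ** ((a_dbm - p_dbm) / (10 * n_p))

print(rssi_to_distance(-45.0))  # 1.0  (at the reference distance)
print(rssi_to_distance(-70.0))  # 10.0 (a 25 dB drop with n_p = 2.5)
```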
The RSSI-based positioning calculation proceeds as follows: the wireless terminal first receives signals from the anchor nodes; with the signal strengths known, it estimates the RSSI value from each anchor node to the unknown node and calculates the corresponding distances. The coordinates of the unknown node are then obtained by the formula calculation, as shown in fig. 9.
According to the relationship between the RSSI value and the propagation distance, the distances d1, d2 and d3 from three anchor nodes with known coordinates (x1, y1), (x2, y2) and (x3, y3) to the unknown point M(x, y) satisfy:

(x - x1)² + (y - y1)² = d1²
(x - x2)² + (y - y2)² = d2²
(x - x3)² + (y - y3)² = d3²
Solving this system of equations yields the coordinates (x, y) of the point M, so that the unknown point M is positioned.
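The system of circle equations can be solved by subtracting the first equation from the other two, which cancels the quadratic terms and leaves a 2×2 linear system. A minimal sketch (the anchor coordinates and distances below are illustrative, not from the patent):

```python
def trilaterate(anchors, dists):
    """Solve (x - xi)^2 + (y - yi)^2 = di^2 for (x, y), given three
    anchors and distances. Subtracting the first circle equation from
    the other two yields a linear system solved here by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Linearised system: a11*x + a12*y = b1 ; a21*x + a22*y = b2
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    b1 = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b2 = d1**2 - d3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a11 * a22 - a12 * a21  # zero if the anchors are collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Three APs and exact distances to the point (2, 3):
x, y = trilaterate([(0, 0), (10, 0), (0, 10)],
                   [13 ** 0.5, 73 ** 0.5, 53 ** 0.5])
print(round(x, 3), round(y, 3))  # 2.0 3.0
```

In practice the RSSI-derived distances are noisy, so a least-squares variant over more than three anchors would be used rather than this exact solve.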
Compared with other positioning algorithms, the Wi-Fi positioning algorithm adopted by the invention is easier to implement; because it places only modest demands on the hardware configuration of the system and exchanges relatively little interactive data, the positioning time is greatly shortened.
In an outdoor environment, the system converts satellite positioning coordinates and point-of-interest coordinates into virtual coordinates, and finally places virtual road signs and virtual routes on the real road using augmented reality technology to achieve the effect of virtual-real fusion. In an indoor environment, the system integrates indoor map information acquisition, indoor navigation positioning and augmented reality navigation: a WiFi fingerprint matching positioning algorithm provides area-level positioning, which is combined with a visual matching positioning method to achieve high-precision indoor positioning.
Parts of the invention may be implemented in hardware, software, firmware or a combination thereof, and in the above embodiments, the steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system.
The above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions.

Claims (8)

1. An immersive navigation system based on augmented reality and satellite positioning of a mobile terminal, comprising:
the outdoor augmented reality navigation system (1), which first acquires user positioning information, fuses the attitude data of the user equipment with Gaode (AMap) map data, and then superimposes virtual navigation information on the real scene;
the indoor augmented reality navigation system (2), which fuses virtual navigation information with video stream images in real time and displays the virtual-real fused navigation information by establishing a visual point cloud map and a WiFi fingerprint identification library and calculating the position and attitude information of the user equipment.
2. The mobile terminal-based augmented reality and satellite positioning immersive navigation system of claim 1, wherein the outdoor augmented reality navigation system (1) comprises a satellite positioning module (101) and a real scene rendering module (102), wherein,
the satellite positioning module (101) is used for acquiring the current user position information;
the real scene rendering module (102) is used for displaying a virtual navigation path, drawing coordinate data of points of interest in real time and tracking the attitude of the mobile terminal, and the real scene rendering module (102) is electrically connected with the satellite positioning module (101).
3. The mobile terminal-based augmented reality and satellite positioning immersive navigation system of claim 2, wherein the outdoor augmented reality navigation system (1) further comprises a Gaode (AMap) map data module (103) and a sensor module (104), wherein,
the Gaode (AMap) map data module (103) is used for map searching and path planning, and the Gaode (AMap) map data module (103) is electrically connected with the real scene rendering module (102);
the sensor module (104) acquires the attitude and azimuth information of the mobile phone according to the state of the mobile phone, and the sensor module (104) is electrically connected with the Gaode (AMap) map data module (103).
4. The mobile terminal-based augmented reality and satellite positioning immersive navigation system of claim 3, wherein the sensor module (104) comprises a mobile phone inertial sensor, an acceleration sensor and a magnetometer.
5. The immersive mobile terminal-based augmented reality and satellite positioning navigation system of claim 1, wherein the indoor augmented reality navigation system (2) comprises a visual point cloud map acquisition module (201) and a WLAN fingerprint information acquisition and positioning module (202), wherein,
the visual point cloud map acquisition module (201) is used for acquiring indoor environment information in real time and establishing an indoor visual map;
the WLAN fingerprint information acquisition and positioning module (202) establishes a fingerprint identification database by acquiring fingerprint information, and the WLAN fingerprint information acquisition and positioning module (202) is electrically connected with the visual point cloud map acquisition module (201).
6. The mobile terminal-based augmented reality and satellite positioning immersive navigation system of claim 5, wherein the indoor augmented reality navigation system (2) further comprises a feature point detection module (203) and a virtual-real rendering module (204), wherein,
the feature point detection module (203) is used for selecting visual difference points in the image, tracking and updating them in real time and obtaining stable feature points through an algorithm, and the feature point detection module (203) is electrically connected with the WLAN fingerprint information acquisition and positioning module (202);
the virtual-real rendering module (204) is used for fusing virtual navigation information with video stream images in real time and displaying them on the mobile phone screen, and the virtual-real rendering module (204) is electrically connected with the feature point detection module (203).
7. The immersive navigation system for augmented reality and satellite positioning based on a mobile terminal of claim 6, wherein the visual difference points are points with large differences in brightness, color and gray scale in the image.
8. The immersive mobile terminal-based augmented reality and satellite positioning navigation system of claim 6, wherein the algorithm is a Wi-Fi positioning algorithm.
CN202110768239.3A 2021-07-07 2021-07-07 Immersive navigation system based on augmented reality and satellite positioning of mobile terminal Pending CN113484889A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110768239.3A CN113484889A (en) 2021-07-07 2021-07-07 Immersive navigation system based on augmented reality and satellite positioning of mobile terminal


Publications (1)

Publication Number Publication Date
CN113484889A true CN113484889A (en) 2021-10-08

Family

ID=77941903


Country Status (1)

Country Link
CN (1) CN113484889A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014020863A (en) * 2012-07-17 2014-02-03 Zenrin Datacom Co Ltd Portable navigation device
CN103913174A (en) * 2012-12-31 2014-07-09 深圳先进技术研究院 Navigation information generation method and system, mobile client and server
CN205333084U (en) * 2015-12-08 2016-06-22 悠泊物联网科技(上海)有限公司 Car navigation system based on AR technique
US20160284125A1 (en) * 2015-03-23 2016-09-29 International Business Machines Corporation Path visualization for augmented reality display device based on received data and probabilistic analysis
CN107014395A (en) * 2017-03-31 2017-08-04 武汉大学 Directive property panoramic navigation system based on virtual reality technology
CN110843674A (en) * 2019-11-22 2020-02-28 深圳晨芯时代科技有限公司 On-vehicle display module assembly system based on AR augmented reality technique
CN111311756A (en) * 2020-02-11 2020-06-19 Oppo广东移动通信有限公司 Augmented reality AR display method and related device
CN112325883A (en) * 2020-10-19 2021-02-05 湖南大学 Indoor positioning method for mobile robot with WiFi and visual multi-source integration


Non-Patent Citations (1)

Title
Zhou Guoqing, Zhou Xiang: "Principles, Technology and Applications of Area-Array LiDAR Imaging", pages 162-164 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination