CN112525185B - AR navigation method based on positioning and AR head-mounted display device - Google Patents
- Publication number
- CN112525185B CN112525185B CN202011458842.3A CN202011458842A CN112525185B CN 112525185 B CN112525185 B CN 112525185B CN 202011458842 A CN202011458842 A CN 202011458842A CN 112525185 B CN112525185 B CN 112525185B
- Authority
- CN
- China
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/04—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by terrestrial means
- G01C21/08—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by terrestrial means involving use of the magnetic field of the earth
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Abstract
The invention relates to a positioning-based AR navigation method and an AR head-mounted display device. The method comprises: determining the spatial three-dimensional coordinates of a user through a positioning system of the AR head-mounted display device; acquiring a point-of-interest map associated with the user's spatial three-dimensional coordinates; acquiring the FOV orientation of the AR head-mounted display device through its FOV orientation detection system; displaying AR information about a point of interest on the AR head-mounted display device according to the user's spatial three-dimensional coordinates, the FOV orientation, and the spatial three-dimensional coordinates of the point of interest; and displaying the AR information in a differentiated manner based on the positional relationship between the point of interest and the user. By this method, the user can be assisted in locating points of interest and provided with AR information about them even in a complex geographic environment.
Description
Technical Field
The invention relates to the field of AR navigation software, and in particular to a positioning-based AR navigation method and an AR head-mounted display device.
Background
Current AR navigation schemes usually display AR information to the user by means of image recognition or two-dimensional code scanning. This approach is constrained by the scene and can generally only be used indoors. For outdoor AR navigation, the geographic environment is complex, so CV-based AR navigation suffers from high training difficulty and a high false-recognition rate. Existing AR navigation schemes therefore cannot provide a good user experience.
Disclosure of Invention
The positioning-based AR navigation method and AR head-mounted display device disclosed herein can assist the user in accurately locating points of interest and can provide AR information about those points of interest in a complex geographic environment, thereby performing AR navigation for the user.
According to one aspect of the present invention, one or more embodiments provide a positioning-based AR navigation method, comprising: determining the spatial three-dimensional coordinates of a user through a positioning system of an AR head-mounted display device; acquiring, according to the user's spatial three-dimensional coordinates, a point-of-interest map associated with those coordinates, wherein the point-of-interest map comprises at least one point of interest and its spatial three-dimensional coordinates; acquiring, through a FOV orientation detection system of the AR head-mounted display device, the FOV orientation of the device, wherein the FOV orientation is the orientation of the center of the FOV range corresponding to the display interface of the device; displaying AR information about the point of interest on the AR head-mounted display device according to the user's spatial three-dimensional coordinates, the FOV orientation, and the spatial three-dimensional coordinates of the point of interest, such that the AR information corresponds to the point of interest; and displaying the AR information in a differentiated manner based on the positional relationship between the point of interest and the user.
According to another aspect of the present invention, one or more embodiments also provide an AR head-mounted display device comprising a processor and a memory, the memory storing a computer program which, when executed by the processor, causes the AR head-mounted display device to perform the above-mentioned method.
The beneficial effects of one or more embodiments of the invention are as follows: unlike traditional AR navigation, the method does not need to determine points of interest by image recognition, code scanning, or similar means; it can realize navigation in complex scenes, particularly large scenic areas, and provides an efficient navigation interaction mode for users.
Drawings
FIG. 1 shows a flowchart of a positioning-based AR navigation method in accordance with one or more embodiments of the invention;
FIG. 2 illustrates an AR head mounted display device in accordance with one or more embodiments of the invention;
FIG. 3 illustrates a schematic view of a scenic navigation in accordance with one or more embodiments of the present invention;
FIG. 4 shows a schematic view of FOV orientation in accordance with one or more embodiments of the invention;
FIG. 5 illustrates a display of AR information in accordance with one or more embodiments of the invention;
FIG. 6 illustrates a display manner of AR information according to one or more embodiments of the present invention.
Detailed Description
To further illustrate the technical means adopted by the present invention to achieve its intended objects, and their effects, the following gives a detailed description of specific embodiments, structures, features, and effects of the positioning-based AR navigation method and the AR head-mounted display device according to the present invention, with reference to the accompanying drawings and preferred embodiments.
As shown in fig. 1, the flowchart of the positioning-based AR navigation method according to an embodiment of the present invention includes steps S1 to S4, specifically:
s1, determining a spatial three-dimensional coordinate of a user through a positioning system of an AR head-mounted display device;
s2, obtaining an interest point map associated with the spatial three-dimensional coordinates of the user according to the spatial three-dimensional coordinates of the user, wherein the interest point map comprises at least one interest point and the spatial three-dimensional coordinates of the at least one interest point;
s3, acquiring the FOV orientation of the AR head-mounted display device through a FOV orientation detection system of the AR head-mounted display device, wherein the FOV orientation is the orientation of the center of the FOV range corresponding to the display interface of the AR head-mounted display device;
s4, displaying AR information about the point of interest on the AR head-mounted display device according to the spatial three-dimensional coordinates of the user, the FOV orientation, and the spatial three-dimensional coordinates of the point of interest, so that the AR information and the point of interest correspond to each other.
In step S1, the spatial three-dimensional coordinates of the user are determined by a positioning system of the AR head-mounted display device. Specifically, the AR head-mounted display device can superimpose virtual information onto the real world, so that the wearer sees the virtual information and the real-world picture simultaneously, the two kinds of information complementing each other. In one or more embodiments, the AR head-mounted display device includes different types of head-mounted devices such as AR/MR glasses, AR/MR head rings, and AR/MR helmets. MR (Mixed Reality) and AR (Augmented Reality) differ in that MR technology emphasizes virtual information while AR technology emphasizes real information; there is no essential difference between the two technical solutions, so in the present invention "AR" refers collectively to reality-augmentation technologies including both AR and MR. Compared with the AR functions of a smartphone, an AR head-mounted display device places the display module directly in front of the eyes, freeing the user's hands; the FOV (Field of View) center of its display module coincides with the user's FOV center, providing an immersive AR experience. The FOV of the display module is its displayable area.
As shown in fig. 2, in one or more embodiments, the AR head-mounted display device is AR glasses 100, and the AR glasses 100 may include one or two display devices 10 corresponding to one or both eyes of the user. The display device may adopt a semi-transmissive, semi-reflective display scheme using a prism, an LCD, an optical waveguide, a Birdbath, a free-form surface reflector, or the like. In addition, the AR glasses 100 may further include a frame 20. In some embodiments, the sensors, positioning system, processing unit, memory, and battery of the AR glasses may be placed inside the frame 20; in other embodiments, one or more of these components may be integrated into a separate accessory (not shown) and connected to the AR glasses 100 through a data line.
The AR glasses 100 include a positioning system that can be used to obtain the spatial three-dimensional coordinates of the user. In one embodiment, the spatial three-dimensional coordinates may be determined from positioning information and altitude information. It is understood that the spatial three-dimensional coordinates of the user are three-dimensional coordinates based on a certain reference frame. In one embodiment, the three-dimensional spatial coordinates may be represented by longitude, latitude, and altitude, where longitude and latitude together constitute the positioning information. In one or more embodiments, the positioning system includes a satellite positioning device and a barometric pressure sensor. The satellite positioning device may be any device that uses satellites for positioning, such as GPS or BeiDou, and can acquire accurate current positioning information, i.e., longitude and latitude. In other embodiments, other coordinate systems may be employed to determine the geographic location of the user; it is understood that different coordinate systems can each be converted to longitude, latitude, and altitude. The barometric pressure sensor acquires a pressure value, from which the altitude is calculated. Generally speaking, the higher the altitude, the lower the barometric pressure; the physical relationship between the two is as follows:
a = 44330 × (1 − (P / P₀)^(1/5.255))
where P₀ is the standard atmospheric pressure, P is the currently measured barometric pressure, and a is the altitude. After the barometric sensor obtains the currently measured pressure value, the altitude a can be calculated by the above formula.
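As a sketch of this pressure-to-altitude conversion, the following hypothetical helper implements the standard international barometric formula; the constants 44330 m and 1013.25 hPa are textbook values, not taken from the patent text:

```python
def pressure_to_altitude(p_hpa, p0_hpa=1013.25):
    """Estimate altitude (metres) from barometric pressure (hPa).

    Uses the international barometric formula:
        a = 44330 * (1 - (P / P0) ** (1 / 5.255))
    where P0 is standard sea-level pressure.
    """
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))
```

At standard pressure the helper returns 0 m, and lower measured pressure yields a higher estimated altitude, matching the relationship described above.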
In addition, altitude information may also be obtained from the satellite positioning device and used to calibrate the altitude obtained by the barometric pressure sensor. For example, on Android the altitude value may be returned by the GPS global positioning system through the LocationManager service.
In step S2, according to the spatial three-dimensional coordinates of the user, a point-of-interest map associated with those coordinates is obtained, where the map includes at least one point of interest and its spatial three-dimensional coordinates. Specifically, the point-of-interest map associated with the user's coordinates may be a navigation map of the scenic area where the user is located, a star distribution map, an unmanned aerial vehicle navigation map, or the like. In one embodiment, the spatial three-dimensional coordinates of a fixed point of interest are generally predetermined. In another embodiment, for a movable point of interest, a time series of spatial three-dimensional coordinates may be provided according to its movement track; alternatively, the coordinates may be provided by a positioning device on the point of interest itself, and in one embodiment the positioning method of the point of interest may be the same as the positioning method of the user described in step S1. In one embodiment, as shown in fig. 3, taking the scenic-area navigation map as an example, the user is located at the top of a peak in the scenic area with spatial three-dimensional coordinates (a0, b0, c0), and the points of interest are one or more scenic spots; in fig. 3 there are three points of interest with known three-dimensional coordinates (a1, b1, c1), (a2, b2, c2), and (a3, b3, c3). For example, if one of the points of interest is a pyramid, the three-dimensional coordinates of the pyramid can be taken as its highest point or its center point.
In another embodiment, taking the star map as an example, the point of interest is a specific constellation or star; since a constellation or star is by default at an effectively infinite distance, its three-dimensional coordinates can be determined by its azimuth and elevation.
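To make the star case concrete, an azimuth/elevation pair can be converted into a unit direction vector for rendering. This helper and its east-north-up convention are illustrative assumptions, not part of the patent:

```python
import math

def azel_to_unit_vector(azimuth_deg, elevation_deg):
    """Convert azimuth/elevation to a unit direction vector.

    Assumed east-north-up convention: azimuth is measured clockwise
    from north, elevation upward from the horizon.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    east = math.cos(el) * math.sin(az)
    north = math.cos(el) * math.cos(az)
    up = math.sin(el)
    return (east, north, up)
```

For a star at effectively infinite distance, this direction alone (without any range) is enough to place its AR label in the display.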
In some embodiments, a point-of-interest map associated with an electronic fence can be obtained by determining whether the user's spatial three-dimensional coordinates fall within the electronic fence for a predetermined threshold time. The electronic fence may be a virtual area preset at the tourist site. Because of positioning-signal accuracy limits, a user near the edge of the fence might otherwise be determined to enter and exit the fence alternately, degrading the user experience; by setting a threshold time, e.g., 1 s, 2 s, or 5 s, it can be ensured that an incorrect point-of-interest map is not provided due to positioning-accuracy problems. For example, a scenic spot may set a paid area and a free area separated by an electronic fence, provide different points of interest for each area, or display different AR information for the same point of interest, thereby providing a diversified AR experience.
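The threshold-time behavior described above is essentially a debounce on fence membership: a raw in/out reading only becomes the confirmed state after it has held for the threshold time. A minimal sketch, with hypothetical class and method names:

```python
import time

class DebouncedGeofence:
    """Report fence entry/exit only after the raw reading holds for hold_s seconds."""

    def __init__(self, contains, hold_s=2.0):
        self.contains = contains        # callable: (x, y) -> bool, the fence test
        self.hold_s = hold_s            # threshold time in seconds
        self.state = None               # confirmed inside(True)/outside(False)
        self._pending = None            # candidate state awaiting confirmation
        self._pending_since = None

    def update(self, x, y, now=None):
        now = time.monotonic() if now is None else now
        raw = self.contains(x, y)
        if raw == self.state:
            self._pending = None        # reading agrees with confirmed state
        elif raw != self._pending:
            self._pending, self._pending_since = raw, now  # new candidate
        elif now - self._pending_since >= self.hold_s:
            self.state, self._pending = raw, None          # candidate confirmed
        return self.state
```

A brief flicker across the fence edge leaves the confirmed state unchanged, which is exactly the behavior the threshold time is meant to guarantee.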
In step S3, the FOV orientation of the AR head-mounted display device is acquired by its FOV orientation detection system, where the FOV orientation is the orientation of the center of the FOV range corresponding to the display interface of the device. Specifically, the FOV orientation refers to the orientation data of the display interface of the AR head-mounted display device; because the device is worn directly in front of the eyes, this orientation can be considered consistent with the user's viewing direction. As shown in fig. 4, when the user looks at the distant tower 200 through the AR head-mounted display device 10, the user's viewing direction and the orientation of the center line of the device's FOV are both indicated by the dashed line 11. In one embodiment, the FOV orientation comprises the horizontal direction of the FOV and the angle between the FOV and the horizontal plane, so the FOV orientation can likewise be determined by an azimuth angle and an elevation angle. In some of these embodiments, the FOV orientation detection system includes a magnetic sensor and an IMU sensor. An IMU (inertial measurement unit) sensor generally refers to a combined unit of three accelerometers and three gyroscopes mounted on mutually perpendicular measurement axes, which can measure motion in six degrees of freedom (6DOF).
When a user wears the AR head-mounted display device, because its display interface coincides with the wearer's field of view, the horizontal orientation of the device's FOV, and hence the horizontal direction of the user's gaze, can be determined by detecting the Earth's magnetic field with the device's magnetic sensor. The user's head movement can also be detected by the device's IMU sensor to determine the angle between the user's line of sight and the horizontal plane, i.e., how far the head is pitched up or down. From these, the horizontal and vertical directions of the user's gaze can be determined.
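A simplified version of this two-sensor orientation estimate might look as follows. It assumes a level magnetometer (no tilt compensation) and derives pitch from the accelerometer's gravity reading; this is a rough illustration under stated assumptions, not the patent's actual detection system:

```python
import math

def fov_orientation(mag_x, mag_y, accel_x, accel_y, accel_z):
    """Estimate (azimuth_deg, pitch_deg) for the FOV center line.

    Assumptions: the magnetometer x axis points north and y east when the
    device is level (no tilt compensation), and the accelerometer measures
    gravity with x along the forward/view axis.
    """
    # heading in the horizontal plane from the magnetic field components
    azimuth = math.degrees(math.atan2(mag_y, mag_x)) % 360.0
    # pitch: angle of the forward axis above the horizon, from gravity
    pitch = math.degrees(math.atan2(-accel_x, math.hypot(accel_y, accel_z)))
    return azimuth, pitch
```

Real devices would fuse gyroscope data and compensate for tilt; sensor frameworks such as Android's expose this fusion directly, so this sketch only shows where azimuth and pitch come from.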
In step S4, AR information about the point of interest is displayed on the AR head-mounted display device according to the spatial three-dimensional coordinates of the user, the FOV orientation, and the spatial three-dimensional coordinates of the point of interest, such that the AR information corresponds to the point of interest. Specifically, if the three-dimensional coordinates of two points in space are known, the geometric relationship between them (e.g., distance and bearing) can be determined. By calculating the spatial angle between the user-to-point-of-interest line and the FOV orientation, and because the size of the display area of the AR head-mounted display device is fixed and its distance from the user's eye is known, the position of the plane containing the display area can be computed from the FOV orientation. It can thus be approximately determined whether the line from the user's eye to the point of interest passes through that plane, and therefore where on the display of the AR head-mounted display device the point of interest appears. Because the AR head-mounted display device achieves a semi-transmissive, semi-reflective display effect, the AR information corresponding to the point of interest can be displayed on it so that the user observes the real point of interest and the AR information simultaneously. In one or more embodiments, the display manner of the AR information is as shown in fig. 5: an iron tower 200 is a real building that the user can see through the display device 10 and serves as an example of a point of interest; the AR information includes a text segment 201 and a video segment 202, connected to the iron tower 200 by an indication line, so that the user can associate the AR information with the iron tower 200, enhancing the experience of visiting scenic spots. In one or more embodiments, the AR information may be played directly when the point of interest is identified, or the user may trigger playback through voice interaction or other means.
In addition, AR information may be displayed differently according to whether or not the point of interest falls within the FOV. In some embodiments, if the point of interest falls within the FOV, the AR information corresponding to it is displayed on the display interface of the AR head-mounted display device. In other embodiments, if the point of interest is outside the FOV, the user is prompted with the location of the point of interest on the display interface, so that the user can find the point of interest by turning their view. As shown in fig. 6, in one embodiment, when the iron tower 200 does not fall within the FOV, a mark may be displayed on the side of the display device 10 nearer to the tower, prompting the user to keep moving the field of view to the left until the tower can be observed. In other embodiments, depending on the spatial position of the point of interest relative to the FOV, the user may be prompted in the display device 10 to look up, look down, move left, move right, and so on to observe the point of interest.
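The in-FOV test and the out-of-FOV prompt above can be sketched together by comparing the bearing and elevation of the user-to-POI line against assumed half-FOV angles. All names, the local east/north/up coordinates, and the 40°×25° FOV are illustrative assumptions:

```python
import math

def poi_display_state(user, poi, fov_dir, h_fov_deg=40.0, v_fov_deg=25.0):
    """Return 'visible' if the POI falls inside the FOV, else a turn prompt.

    user, poi: (east, north, up) positions in metres; fov_dir: FOV direction
    vector in the same frame. FOV angles are assumed, not from the patent.
    """
    to_poi = (poi[0] - user[0], poi[1] - user[1], poi[2] - user[2])
    # horizontal bearing difference between the POI and the FOV center
    bearing = math.degrees(math.atan2(to_poi[0], to_poi[1]))
    fov_bearing = math.degrees(math.atan2(fov_dir[0], fov_dir[1]))
    d_az = (bearing - fov_bearing + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    # elevation difference
    el = math.degrees(math.atan2(to_poi[2], math.hypot(to_poi[0], to_poi[1])))
    fov_el = math.degrees(math.atan2(fov_dir[2], math.hypot(fov_dir[0], fov_dir[1])))
    d_el = el - fov_el
    if abs(d_az) <= h_fov_deg / 2 and abs(d_el) <= v_fov_deg / 2:
        return "visible"
    if abs(d_az) > h_fov_deg / 2:
        return "turn right" if d_az > 0 else "turn left"
    return "look up" if d_el > 0 else "look down"
```

A POI behind the user's right shoulder yields "turn right", matching the edge-mark prompt described for the iron tower.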
In one or more embodiments, the positioning-based AR navigation method disclosed herein further includes step S5: displaying the AR information in a differentiated manner based on the positional relationship between the point of interest and the user. In AR-glasses navigation, because the AR information is superimposed directly onto the real scene, whether the AR information matches the real scene is crucial for the user to understand the relationship between the two. In the traditional AR navigation mode based on a handheld smart device, an image of the real scene must be captured by a camera and the AR information superimposed on that image; because the image seen through the camera differs in scale from the real scene in the user's own field of view, the user cannot truly experience the blending of the virtual and the real. Furthermore, due to positioning errors and FOV-orientation measurement errors, the correspondence between AR information and the real scene may be labeled incorrectly when several points of interest are adjacent. Therefore, displaying the AR information in a differentiated manner based on the positional relationship between the points of interest and the user lets the user clearly understand how the display state of the AR information changes with the positions of the points of interest, and avoids the problem of the user being unable to match closely spaced points of interest in the field of view to their AR information.
In some embodiments, one or more of the color, size, transparency, and layer position of the AR information may be adjusted according to the distance between the point of interest and the user, so as to display the AR information in a differentiated manner. For example, the AR information of a point of interest farther from the user may be placed on a lower display layer and/or scaled down according to the near-large, far-small principle. In other embodiments, one or more of the color, size, transparency, and layer position of the AR information may be adjusted according to the distance between the AR information and the center of the display interface.
For example, if two or more points of interest exist within the FOV, the AR information may be displayed with nearer items larger and farther items smaller, so that the user can match each piece of AR information to its point of interest. Differentiation may also be performed by color: the AR information of a point of interest closer to the user may be displayed in a darker color, or that of a farther point of interest with higher transparency. As another example, the layer of the AR information closest to the user may be placed uppermost so that it is not occluded by other AR information. Also, because the center of the display module is where the user most easily focuses, AR information closer to the center of the display interface may be enlarged and highlighted, giving the user a better AR experience.
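One way to realize such distance-based differentiation is to interpolate scale, transparency, and layer order between near and far distances. The function, its parameter ranges, and the specific factors below are illustrative assumptions, not values from the patent:

```python
def ar_style(distance_m, near_m=100.0, far_m=2000.0):
    """Derive display style for AR information from POI distance.

    near_m/far_m bound the interpolation; the 0.7 and 0.6 attenuation
    factors are arbitrary choices for the sketch.
    """
    t = min(max((distance_m - near_m) / (far_m - near_m), 0.0), 1.0)
    return {
        "scale": 1.0 - 0.7 * t,   # far POIs drawn smaller (near-large, far-small)
        "alpha": 1.0 - 0.6 * t,   # far POIs more transparent
        "z_order": -distance_m,   # nearer POIs layered on top
    }
```

With two POIs in the FOV, sorting their AR cards by `z_order` and applying `scale`/`alpha` reproduces the larger-darker-near, smaller-fainter-far presentation described above.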
Specifically, taking fig. 5 as an example, if the detected distance between the user and the iron tower is 1 km, the text 201 and the video 202 are displayed at the sizes shown in fig. 5. When the user moves to a distance of 500 m, the text 201 and the video 202 are enlarged by a certain ratio. Suppose another point of interest (not shown) exists in the scene of fig. 5 and the distance between the user and it remains constant; then the size of its AR information also remains constant. Since the iron tower 200 inevitably appears larger in the user's field of view as the distance decreases, and its AR information is enlarged in synchronization, the user can perceive the association between the AR information and the real object, improving the user experience.
In one or more embodiments, the position of the point of interest may change continuously over time. For example, in the embodiment that uses a star as the point of interest, the relative position of the star and the user depends on time because of the Earth's rotation and planetary motion. Therefore, in some embodiments a time parameter may also be obtained, and since the motion trajectory of the star corresponds to time, a point-of-interest map associated with the time parameter is then obtained. As another example, in the embodiment that uses an unmanned aerial vehicle as the point of interest, since the position of the unmanned aerial vehicle changes in real time during flight, updates to its positioning and altitude information can be acquired from it over a communication connection, and the display position of the AR information adjusted according to those updates.
In some preferred embodiments, because of errors in the positioning system, the acquired three-dimensional spatial information of the user may be inaccurate. After step S4, the current scene may be identified using an image recognition algorithm to determine whether the displayed position of the AR information matches the interest point. For example, in fig. 5, when the user observes the iron tower 200, the position of the iron tower obtained by image recognition may be compared with the position of the iron tower determined by positioning, so as to judge whether the current display position of the AR information is consistent with the position of the iron tower in the image. When the features of the interest point are sufficiently distinctive, image recognition can thus be used for correction.
In other preferred embodiments, the acquired three-dimensional spatial information of the user may be inaccurate because of errors in the positioning system, and after step S4, error correction may be performed according to feedback from the user. For example, for a map with multiple interest points, the user can first be recommended to observe an interest point with distinctive features, and a judgment prompt can be sent asking the user to confirm whether the current AR information matches that interest point. If the user feeds back that the match is successful, the current spatial three-dimensional coordinates of the user are proved accurate. If the user confirms that a deviation exists, the user can move the head so that the AR information is realigned with the real interest point. After the user completes this rectification, the same rectification parameters can be applied to the matching of the other interest points.
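Reusing the rectification parameters for the other interest points can be sketched as applying the same screen-space offset everywhere. The data layout (a name-to-position dictionary and a `(dx, dy)` offset) is an assumption for illustration.

```python
def apply_rectification(poi_screen_positions, offset):
    """Apply a user-confirmed correction offset to all interest points.

    After the user aligns one well-featured interest point by moving
    the head, the measured (dx, dy) screen offset is reused for every
    other point. The dictionary layout is an illustrative assumption.
    """
    dx, dy = offset
    return {name: (x + dx, y + dy)
            for name, (x, y) in poi_screen_positions.items()}
```

In practice the offset would be measured once during the user's confirmation step and then applied to each subsequent AR label before display.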
Also disclosed in accordance with one or more embodiments of the present invention is an AR head mounted display device comprising a processor and a memory, the memory storing a computer program that, when executed by the processor, causes the AR head mounted display device to perform the steps of S1-S4 described above. One embodiment of the AR head-mounted display device is the AR glasses 100 shown in fig. 2, and the specific steps performed therein are not described herein again.
Although the present invention has been described with reference to a preferred embodiment, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (9)
1. A method of location-based AR navigation, comprising,
determining spatial three-dimensional coordinates of a user through a positioning system of the AR head-mounted display device;
acquiring an interest point map associated with the spatial three-dimensional coordinate of the user according to the spatial three-dimensional coordinate of the user, wherein the interest point map comprises at least one interest point and the spatial three-dimensional coordinate of the at least one interest point;
acquiring, by a FOV orientation detection system of the AR head-mounted display device, a FOV orientation of the AR head-mounted display device, wherein the FOV orientation is an orientation of a center of a FOV range corresponding to a display interface of the AR head-mounted display device;
displaying AR information about the point of interest on an AR head mounted display device according to the user's spatial three-dimensional coordinates, FOV orientation, and the spatial three-dimensional coordinates of the point of interest, such that the AR information and the point of interest correspond,
displaying the AR information differentially based on the positional relationship between the interest point and the user;
the displaying, on an AR head-mounted display device, AR information about a point of interest according to the spatial three-dimensional coordinate of the user, the FOV orientation, and the spatial three-dimensional coordinate of the point of interest, such that the AR information corresponds to the point of interest, further comprising:
determining a geometric positional relationship of the FOV and the point of interest based on the spatial three-dimensional coordinates of the user and the spatial three-dimensional coordinates of the point of interest,
and according to the geometric positional relationship, calculating the spatial angle between the line connecting the user and the interest point and the FOV orientation, and determining the display area of the interest point on the AR head-mounted display device.
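The angular test in the claim above can be sketched as follows: compute the bearing and elevation of the line from the user to the interest point and compare them with the FOV orientation. The coordinate convention (x east, y north, z up) and the half-FOV angles are assumptions for illustration, not values from the patent.

```python
import math

def poi_in_fov(user_xyz, poi_xyz, fov_yaw_deg, fov_pitch_deg,
               half_fov_h_deg=20.0, half_fov_v_deg=15.0):
    """Decide whether a point of interest falls inside the FOV range.

    Computes the bearing (yaw from north) and elevation of the line
    from the user to the point of interest, then compares both with
    the FOV orientation. Axis convention and half-FOV angles are
    illustrative assumptions.
    """
    dx = poi_xyz[0] - user_xyz[0]
    dy = poi_xyz[1] - user_xyz[1]
    dz = poi_xyz[2] - user_xyz[2]
    bearing = math.degrees(math.atan2(dx, dy))                   # yaw from north
    elevation = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    # Wrap the yaw difference into (-180, 180] before comparing.
    dyaw = (bearing - fov_yaw_deg + 180.0) % 360.0 - 180.0
    dpitch = elevation - fov_pitch_deg
    return abs(dyaw) <= half_fov_h_deg and abs(dpitch) <= half_fov_v_deg
```

The same angle differences `(dyaw, dpitch)`, scaled by the display resolution, could also locate the display area of the interest point on the screen, as the claim requires.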
2. The method of claim 1,
the positioning system comprises a satellite positioning device and a barometric sensor, the spatial three-dimensional coordinates are determined by positioning information and altitude information, wherein,
the satellite positioning device acquires the positioning information and/or the altitude information,
and the air pressure sensor acquires an air pressure value and calculates the altitude information through the air pressure value.
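Calculating altitude from the air pressure value, as in the claim above, is commonly done with the international barometric formula for the standard atmosphere; a minimal sketch, assuming a standard sea-level pressure of 1013.25 hPa:

```python
def altitude_from_pressure(pressure_hpa, sea_level_hpa=1013.25):
    """Altitude in meters from barometric pressure.

    Uses the international barometric formula with standard-atmosphere
    constants (44330 m, exponent 1/5.255). The sea-level reference
    pressure would normally be calibrated locally; 1013.25 hPa is the
    standard-atmosphere assumption.
    """
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```

Combined with the horizontal position from the satellite positioning device, this yields the spatial three-dimensional coordinates of the user.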
3. The method of claim 1, wherein the FOV orientation comprises an orientation of the FOV in a horizontal plane and an angle of the FOV to the horizontal plane;
the FOV orientation detection system comprises a magnetic sensor and an IMU sensor,
acquiring the FOV orientation further comprises:
determining, by the magnetic sensor, the orientation of the FOV in the horizontal plane by detecting the strength of the Earth's magnetic field,
and detecting the head movement of the user through the IMU sensor, and determining the included angle between the FOV and the horizontal plane.
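The two measurements in the claim above can be sketched separately: a heading from the horizontal magnetic-field components, and a pitch angle from the gravity vector reported by the IMU accelerometer. The axis conventions, the tilt-compensation assumption, and the static-head assumption for the pitch estimate are all illustrative.

```python
import math

def heading_from_magnetometer(mx, my):
    """Horizontal FOV orientation in degrees from magnetic north.

    Assumes the horizontal field components (mx east, my north) are
    already tilt-compensated; the axis convention is an assumption.
    """
    return math.degrees(math.atan2(mx, my)) % 360.0

def pitch_from_accelerometer(ax, ay, az):
    """Angle between the FOV and the horizontal plane, in degrees.

    Derived from gravity as measured by the IMU accelerometer while
    the head is momentarily static (an assumption); in practice the
    IMU would fuse gyroscope data as well.
    """
    return math.degrees(math.atan2(-ax, math.hypot(ay, az)))
```

Together the two angles give the FOV orientation (yaw in the horizontal plane plus the included angle with the horizontal plane) used in step S4.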
4. The method of claim 1, further comprising,
if the interest point falls into the FOV range, displaying AR information corresponding to the interest point on a display interface of the AR head-mounted display device;
and if the interest point falls outside the FOV range, prompting the user with the position of the interest point on the display interface of the AR head-mounted display device.
5. The method of any of claims 1-4, wherein the differentially displaying AR information based on a positional relationship between the point of interest and a user, further comprises,
and adjusting one or more of the color, the size, the transparency and the layer position of the AR information according to the distance between the interest point and the user.
6. The method of any of claims 1-4, wherein the differentially displaying AR information based on a positional relationship between the point of interest and a user, further comprises,
and adjusting one or more of the color, the size, the transparency and the layer position of the AR information according to the distance between the AR information and the center of the display interface.
7. The method of claim 1, wherein obtaining a point-of-interest map associated with the spatial three-dimensional coordinates of the user according to the spatial three-dimensional coordinates of the user, further comprises:
the time parameter is obtained and used for the time,
and if the interest point map is related to the time parameter, further acquiring the interest point map related to the time parameter.
8. The method of claim 1, further comprising,
and receiving the update information of the spatial three-dimensional coordinate information of the interest point, and adjusting the display position of the AR information according to the update information.
9. An AR head mounted display device comprising a processor and a memory, the memory storing a computer program that, when executed by the processor, causes the AR head mounted display device to perform the method of any of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011458842.3A CN112525185B (en) | 2020-12-11 | 2020-12-11 | AR navigation method based on positioning and AR head-mounted display device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011458842.3A CN112525185B (en) | 2020-12-11 | 2020-12-11 | AR navigation method based on positioning and AR head-mounted display device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112525185A CN112525185A (en) | 2021-03-19 |
CN112525185B true CN112525185B (en) | 2022-10-28 |
Family
ID=74999114
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011458842.3A Active CN112525185B (en) | 2020-12-11 | 2020-12-11 | AR navigation method based on positioning and AR head-mounted display device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112525185B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115035626A (en) * | 2022-05-19 | 2022-09-09 | 成都中科大旗软件股份有限公司 | Intelligent scenic spot inspection system and method based on AR |
CN115599206A (en) * | 2022-09-29 | 2023-01-13 | Goertek Technology Co., Ltd. (CN) | Display control method, display control device, head-mounted display equipment and medium |
CN116033348B (en) * | 2023-03-30 | 2023-06-30 | 中国兵器科学研究院 | Method and device for generating electronic fence |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110160529A (en) * | 2019-06-17 | 2019-08-23 | 河南田野文化艺术有限公司 | A kind of guide system of AR augmented reality |
WO2020062267A1 (en) * | 2018-09-30 | 2020-04-02 | 华为技术有限公司 | Information prompt method and electronic device |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130278631A1 (en) * | 2010-02-28 | 2013-10-24 | Osterhout Group, Inc. | 3d positioning of augmented reality information |
US9235051B2 (en) * | 2013-06-18 | 2016-01-12 | Microsoft Technology Licensing, Llc | Multi-space connected virtual data objects |
US9875579B2 (en) * | 2014-08-22 | 2018-01-23 | Applied Research Associates, Inc. | Techniques for enhanced accurate pose estimation |
KR20180135395A (en) * | 2017-06-12 | 2018-12-20 | 주식회사 유컴테크놀러지 | Distance measuring apparatus and method for controlling the same |
CN108830944B (en) * | 2018-07-12 | 2020-10-16 | 北京理工大学 | Optical perspective three-dimensional near-to-eye display system and display method |
- 2020-12-11 CN CN202011458842.3A patent/CN112525185B/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020062267A1 (en) * | 2018-09-30 | 2020-04-02 | 华为技术有限公司 | Information prompt method and electronic device |
CN110160529A (en) * | 2019-06-17 | 2019-08-23 | 河南田野文化艺术有限公司 | A kind of guide system of AR augmented reality |
Also Published As
Publication number | Publication date |
---|---|
CN112525185A (en) | 2021-03-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11423586B2 (en) | Augmented reality vision system for tracking and geolocating objects of interest | |
CN112525185B (en) | AR navigation method based on positioning and AR head-mounted display device | |
US20210131790A1 (en) | Information processing apparatus, information processing method, and recording medium | |
US10215989B2 (en) | System, method and computer program product for real-time alignment of an augmented reality device | |
JP5443134B2 (en) | Method and apparatus for marking the position of a real-world object on a see-through display | |
US9401050B2 (en) | Recalibration of a flexible mixed reality device | |
EP3008708B1 (en) | Vision augmented navigation | |
CN108022302B (en) | Stereo display device of Inside-Out space orientation's AR | |
CN107771342B (en) | Augmented reality display method and head-mounted display equipment | |
US9158305B2 (en) | Remote control system | |
WO2019037489A1 (en) | Map display method, apparatus, storage medium and terminal | |
EP3642694B1 (en) | Augmented reality system and method of displaying an augmented reality image | |
CN106370160A (en) | Robot indoor positioning system and method | |
US11598636B2 (en) | Location information display device and surveying system | |
JP2018189470A (en) | Survey system | |
JP2013235367A (en) | Flight path display system, method, and program | |
US10559132B2 (en) | Display apparatus, display system, and control method for display apparatus | |
CN112558008B (en) | Navigation method, system, equipment and medium based on optical communication device | |
JP2023075236A (en) | Locus display device | |
CN108627157A (en) | A kind of head based on three-dimensional marking plate shows localization method, device and three-dimensional marking plate | |
JP2022140903A (en) | Surveying device and surveying method using the same | |
JP2012059079A (en) | Additional information display system, additional information display control method, and additional information display control program | |
JP2023077070A (en) | Method of aligning virtual space with respect to real space | |
JP2023062983A (en) | Virtual iron tower display system | |
CN117268378A (en) | Navigation method, computer device, and computer-readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
EE01 | Entry into force of recordation of patent licensing contract | ||
Application publication date: 20210319 Assignee: Hangzhou Shangqi Digital Technology Co.,Ltd. Assignor: Hangzhou companion Technology Co.,Ltd. Contract record no.: X2024980010907 Denomination of invention: Location based AR navigation method and AR head mounted display device Granted publication date: 20221028 License type: Common License Record date: 20240730 |