CN112525185A - AR navigation method based on positioning and AR head-mounted display device - Google Patents


Info

Publication number: CN112525185A (granted publication: CN112525185B)
Application number: CN202011458842.3A
Authority: CN (China)
Prior art keywords: interest, user, information, point, spatial
Legal status: Granted; Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventor: 赵维奇
Assignee (current and original): Hangzhou Companion Technology Co., Ltd.
Application filed by Hangzhou Companion Technology Co., Ltd.; priority to CN202011458842.3A


Classifications

    • G01C 21/005 — Navigation; navigational instruments, with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C 21/08 — Navigation by terrestrial means involving use of the magnetic field of the earth
    • G01C 21/165 — Dead reckoning by integrating acceleration or speed (inertial navigation), combined with non-inertial navigation instruments
    • G02B 27/017 — Head-up displays; head mounted
    • G06F 3/011 — Input arrangements for interaction between user and computer; arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 2203/012 — Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Abstract

The invention relates to a positioning-based AR navigation method and an AR head-mounted display device. The method comprises: determining the spatial three-dimensional coordinates of a user through a positioning system of the AR head-mounted display device; acquiring a point-of-interest map associated with the user's spatial three-dimensional coordinates; acquiring the FOV orientation of the AR head-mounted display device through its FOV orientation detection system; displaying AR information about a point of interest on the device according to the user's spatial three-dimensional coordinates, the FOV orientation, and the point of interest's spatial three-dimensional coordinates; and displaying the AR information in a differentiated manner based on the positional relationship between the point of interest and the user. The method can assist the user in locating points of interest and provide AR information about them even in a complex geographic environment.

Description

AR navigation method based on positioning and AR head-mounted display device
Technical Field
The invention relates to the field of AR navigation software, in particular to an AR navigation method based on positioning and an AR head-mounted display device.
Background
In current AR navigation schemes, image recognition or two-dimensional-code scanning is usually used to display the corresponding AR information to the user. This approach is constrained by the scene and can generally only be used indoors. For outdoor AR navigation, the geographic environment is complex, so CV-based (computer-vision-based) AR navigation suffers from high training difficulty and a high false-recognition rate; existing AR navigation schemes therefore cannot provide a good user experience.
Disclosure of Invention
The positioning-based AR navigation method and AR head-mounted display device of the present invention can assist the user in accurately locating points of interest and can provide AR information about those points of interest in a complex geographic environment, thereby performing AR navigation for the user.
According to one aspect of the present invention, one or more embodiments of the present invention provide a method of AR navigation by positioning, comprising, by a positioning system of an AR head-mounted display device, determining spatial three-dimensional coordinates of a user; acquiring an interest point map associated with the spatial three-dimensional coordinate of the user according to the spatial three-dimensional coordinate of the user, wherein the interest point map comprises at least one interest point and the spatial three-dimensional coordinate of the at least one interest point; acquiring, by a FOV orientation detection system of the AR head-mounted display device, a FOV orientation of the AR head-mounted display device, wherein the FOV orientation is an orientation of a center of a FOV range corresponding to a display interface of the AR head-mounted display device; displaying AR information about the point of interest on an AR head mounted display device according to the spatial three-dimensional coordinates of the user, the FOV orientation, and the spatial three-dimensional coordinates of the point of interest such that the AR information and the point of interest correspond; and based on the position relation between the interest point and the user, displaying the AR information in a differentiated mode.
According to another aspect of the present invention, one or more embodiments of the present invention also provide an AR head mounted display apparatus, including a processor and a memory, the memory storing a computer program which, when executed by the processor, causes the AR head mounted display apparatus to perform the above-mentioned method.
The beneficial effects of one or more embodiments of the invention are as follows: unlike traditional AR navigation, the method does not need to determine points of interest by image recognition, code scanning, or similar means; it can realize navigation in complex scenes, particularly large scenic areas, and provides an efficient navigation interaction mode for users.
Drawings
Fig. 1 shows a flow diagram of a location-based AR navigation method in accordance with one or more embodiments of the invention;
FIG. 2 illustrates an AR head mounted display device in accordance with one or more embodiments of the invention;
FIG. 3 illustrates a schematic view of a scenic navigation in accordance with one or more embodiments of the present invention;
FIG. 4 shows a schematic view of FOV orientation in accordance with one or more embodiments of the invention;
FIG. 5 illustrates a display of AR information in accordance with one or more embodiments of the invention;
fig. 6 illustrates a display manner of AR information according to one or more embodiments of the present invention.
Detailed Description
To further illustrate the technical means adopted by the present invention to achieve its objects, and the effects thereof, specific embodiments, structures, features, and effects of the positioning-based AR navigation method and AR head-mounted display device are described in detail below with reference to the accompanying drawings and preferred embodiments.
As shown in fig. 1, the flowchart of the location-based AR navigation method according to an embodiment of the present invention includes steps S1-S4, which specifically include:
s1, determining the spatial three-dimensional coordinates of the user through a positioning system of the AR head-mounted display device;
s2, obtaining an interest point map associated with the spatial three-dimensional coordinates of the user according to the spatial three-dimensional coordinates of the user, wherein the interest point map comprises at least one interest point and the spatial three-dimensional coordinates of the at least one interest point;
s3, acquiring the FOV orientation of the AR head-mounted display device through a FOV orientation detection system of the AR head-mounted display device, wherein the FOV orientation is the orientation of the center of the FOV range corresponding to the display interface of the AR head-mounted display device;
s4, displaying AR information about the point of interest on an AR head-mounted display device according to the spatial three-dimensional coordinates of the user, the FOV orientation, and the spatial three-dimensional coordinates of the point of interest, such that the AR information and the point of interest correspond.
In step S1, the spatial three-dimensional coordinates of the user are determined by the positioning system of the AR head-mounted display device. Specifically, an AR head-mounted display device superimposes virtual information onto the real world, so that the wearer sees the virtual information and the real-world picture simultaneously and the two kinds of information complement each other. In one or more embodiments, the AR head-mounted display device includes different types of head-mounted devices such as AR/MR glasses, AR/MR head rings, and AR/MR helmets. MR (mixed reality) and AR (augmented reality) differ in that MR emphasizes the virtual information while AR emphasizes the real information; there is no essential difference between the two in terms of technical solution, so in the present invention "AR" is used collectively for reality-augmenting technologies including both AR and MR. Compared with the AR function of a smartphone, an AR head-mounted display device places the display module directly in front of the user's eyes, freeing both hands; and since the center of the FOV (field of view) of the display module coincides with the center of the user's FOV, an immersive AR experience can be provided. The FOV of the display module is its displayable area.
As shown in fig. 2, in one or more embodiments, the AR head mounted display device is AR glasses 100, and the AR glasses 100 may include one or two display devices 10 corresponding to one or both eyes of the fitting user. The display device can adopt a display scheme which can realize semi-transmission and semi-reflection by using a prism, an LCD, an optical waveguide, a Birdbath, a free-form surface reflector and the like. In addition, the AR glasses 100 may further include a frame 20, and in some embodiments, the sensor, the positioning system, the processing unit, the memory, and the battery of the AR glasses may be placed inside the frame 20, and in other embodiments, one or more components of the sensor, the positioning system, the processing unit, the memory, and the battery may be integrated into another separate accessory (not shown) and connected to the AR glasses 100 through a data line.
The AR glasses 100 include a positioning system that can be used to obtain the spatial three-dimensional coordinates of the user. In one embodiment, the spatial three-dimensional coordinates are determined from positioning information and altitude information; it is understood that they are the user's three-dimensional coordinates in some reference system. In one embodiment, the three-dimensional coordinates are represented by longitude, latitude, and altitude, where longitude and latitude are collectively referred to as positioning information. In one or more embodiments, the positioning system includes a satellite positioning device and a barometric pressure sensor. The satellite positioning device can be any device that uses satellites for positioning, such as GPS or BeiDou, and acquires the current accurate positioning information, i.e., longitude and latitude. In other embodiments, other coordinate systems may be used to determine the user's geographic location; it is understood that different coordinate systems can each be converted to longitude, latitude, and altitude. The barometric pressure sensor acquires a pressure value, from which the altitude is calculated. Generally, the higher the altitude, the lower the barometric pressure; the physical relationship between the two is as follows:
a = 44330 × (1 − (P/P₀)^(1/5.256))

where P₀ is the standard atmospheric pressure, P is the currently measured barometric pressure, and a is the altitude. After the barometric sensor obtains the currently measured pressure value, the altitude a can be calculated by the above formula.
In addition, altitude information may also be obtained from the satellite positioning device and used to calibrate the altitude obtained by the barometric pressure sensor. For example, the altitude value returned by GPS can be read via the LocationManager service provided by the Android system.
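The pressure-to-altitude relationship above can be sketched as a small helper. This is illustrative only: the function name is hypothetical, and the constants 44330 m and 1/5.256 are taken from the standard-atmosphere model rather than from the patent itself.

```python
import math

def altitude_from_pressure(p_hpa: float, p0_hpa: float = 1013.25) -> float:
    """Return altitude in metres from a measured pressure (hPa).

    Implements a = 44330 * (1 - (P/P0)^(1/5.256)), the standard
    barometric altitude formula; p0_hpa defaults to standard
    sea-level pressure.
    """
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.256))
```

At standard sea-level pressure the function returns 0 m; a reading of roughly 899 hPa corresponds to about 1000 m, matching the rule that pressure falls as altitude rises.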
In step S2, a point-of-interest map associated with the spatial three-dimensional coordinates of the user is obtained according to those coordinates, where the point-of-interest map includes at least one point of interest and its spatial three-dimensional coordinates. Specifically, the point-of-interest map may be a navigation map of the scenic area where the user is located, a star chart, an unmanned-aerial-vehicle navigation map, or the like. In one embodiment, for a point of interest at a fixed position, the spatial three-dimensional coordinates are generally predetermined. In another embodiment, for a movable point of interest, a time series of spatial three-dimensional coordinates can be provided according to its movement track; or the coordinates may be provided by a positioning device on the point of interest itself, which in one embodiment may use the same positioning method as described for the user in step S1. In one embodiment, as shown in fig. 3, taking a scenic-area navigation map as an example, the user stands at the top of a peak with spatial three-dimensional coordinates (a0, b0, c0), and the points of interest are one or more scenic spots; in fig. 3 there are three points of interest with known coordinates (a1, b1, c1), (a2, b2, c2), and (a3, b3, c3). For example, if one of the points of interest is a pyramid, its three-dimensional coordinates can be taken as its highest point or its center point.
In another embodiment, taking the star chart as an example, the point of interest is a specific constellation or star; since a constellation or star is by default treated as being at infinity, its three-dimensional position can be determined by the azimuth and elevation of the star.
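A point of interest at infinity reduces to a pure direction. A minimal sketch of that idea, assuming an east-north-up frame and azimuth measured clockwise from north (conventions not specified in the patent):

```python
import math

def star_direction(azimuth_deg: float, elevation_deg: float):
    """Unit vector (east, north, up) toward a celestial point of interest."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (math.cos(el) * math.sin(az),   # east component
            math.cos(el) * math.cos(az),   # north component
            math.sin(el))                  # up component
```

Because the vector is independent of the user's position, the same direction can be compared directly against the FOV orientation without computing a user-to-point line.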
In some embodiments, a point-of-interest map associated with an electronic fence can be obtained by determining whether the user's spatial three-dimensional coordinates have remained within the fence for a predetermined threshold time. The electronic fence can be a virtual area predefined over the site. Because of positioning-signal accuracy, a user near the edge of the fence may otherwise be judged to be alternately entering and leaving it, degrading the experience; setting a threshold time such as 1 s, 2 s, or 5 s ensures that positioning jitter does not cause the wrong point-of-interest map to be provided. For example, a scenic area may be divided by an electronic fence into a paid area and a free area, with different points of interest provided in each, or with different AR information displayed for the same point of interest, thereby providing a diversified AR experience.
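The threshold-time rule is essentially a debounce on the raw inside/outside reading. A sketch under that interpretation (class and method names are illustrative, not from the patent):

```python
class DebouncedFence:
    """Accept a fence entry/exit only after it has been stable for `dwell` seconds."""

    def __init__(self, dwell: float):
        self.dwell = dwell
        self.state = False      # accepted inside/outside state
        self._pending = None    # (raw_state, timestamp when it first appeared)

    def update(self, raw_inside: bool, t: float) -> bool:
        if raw_inside == self.state:
            self._pending = None                    # jitter resolved itself
        elif self._pending is None or self._pending[0] != raw_inside:
            self._pending = (raw_inside, t)         # start timing the change
        elif t - self._pending[1] >= self.dwell:
            self.state = raw_inside                 # stable long enough: accept
            self._pending = None
        return self.state
```

A brief flicker across the fence edge shorter than the dwell time leaves the accepted state, and hence the selected point-of-interest map, unchanged.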
In step S3, a FOV orientation of the AR head-mounted display device is acquired by a FOV orientation detection system of the AR head-mounted display device, where the FOV orientation is an orientation of a center of a FOV range corresponding to a display interface of the AR head-mounted display device. Specifically, the FOV orientation refers to orientation data of a display interface of the AR head-mounted display device, which can be considered to be consistent with the visual field direction of the user because the AR head-mounted display device is directly worn in front of human eyes. As shown in fig. 4, when the user looks at the tower 200 far away through the AR head mounted display device 10, the viewing direction of the user and the orientation of the center line of the FOV of the AR head mounted display device are both shown by the dashed line 11. In one embodiment, the orientation of the FOV includes the horizontal direction of the FOV and the angle between the FOV and the horizontal direction, so the FOV orientation can be determined by the azimuth angle and the elevation angle as well. In some of these embodiments, the FOV orientation detection system of an AR head-mounted display device includes a magnetic force sensor and an IMU sensor. An IMU (inertial measurement unit) sensor generally refers to a combined unit consisting of 3 accelerometers and 3 gyroscopes, which are mounted on mutually perpendicular measurement axes, which can measure movements in 6 degrees of freedom (6 DOF). 
When the user wears the AR head-mounted display device, the display interface overlaps the user's field of view. The horizontal component of the FOV orientation, i.e., the horizontal direction in which the user is looking, can therefore be determined by detecting the earth's magnetic field with the device's magnetic sensor; the user's head movement can be detected with the device's IMU sensor to determine the angle between the line of sight and the horizontal plane, i.e., how far the head is raised or lowered. From these, the horizontal and vertical directions of the user's view can be determined.
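A heavily simplified sketch of this sensor fusion, assuming the device is held roughly level (real implementations need tilt compensation and magnetometer calibration, and the axis conventions here are assumptions, not the patent's):

```python
import math

def fov_orientation(accel, mag):
    """Return (azimuth_deg, pitch_deg) from raw sensor triples.

    accel: accelerometer (x, y, z); at rest it measures gravity.
    mag:   magnetometer (x, y, z); x ~ device-right, y ~ device-forward.
    """
    ax, ay, az = accel
    mx, my, mz = mag
    # Pitch: angle of the forward axis against the horizontal plane.
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    # Azimuth: heading of the horizontal magnetic field (no tilt compensation).
    azimuth = math.degrees(math.atan2(mx, my)) % 360.0
    return azimuth, pitch
```

Android's `SensorManager.getRotationMatrix`/`getOrientation` pair performs the full, tilt-compensated version of this computation.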
In step S4, AR information about the point of interest is displayed on the AR head-mounted display device according to the user's spatial three-dimensional coordinates, the FOV orientation, and the point of interest's spatial three-dimensional coordinates, such that the AR information corresponds to the point of interest. Specifically, if the three-dimensional coordinates of two points are known, the geometric relationship between them (distance, bearing, etc.) can be determined. Because the size of the display area of the AR head-mounted display device is fixed and its distance from the user's eye is known, the plane of the display area can be computed from the FOV orientation; it can then be computed approximately whether the line from the user's eye to the point of interest passes through that plane, which determines where the point of interest falls within the display area. Because the AR head-mounted display device achieves a semi-transmissive, semi-reflective display effect, AR information corresponding to the point of interest can be shown on the device while the user simultaneously observes the real point of interest. In one or more embodiments, the display of the AR information is as shown in fig. 5: an iron tower 200 is a real building that the user sees through the display device 10 and serves as an example of a point of interest; the AR information includes a text segment 201 and a video segment 202, connected to the tower 200 by indicator lines so that the user can associate the AR information with the tower, enhancing the experience of visiting scenic spots. In one or more embodiments, the AR information may play automatically when the point of interest is identified, or the user may trigger playback through voice interaction or similar means.
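The line-through-display-plane test can be approximated in angular terms: the point of interest is visible when the angular offset between the user-to-point direction and the FOV centre fits inside the half-FOV, and that offset maps directly to a screen position. A sketch, with all angles in degrees and the FOV sizes, function name, and normalised screen convention being assumptions:

```python
def project_poi(poi_az, poi_el, fov_az, fov_el, h_fov=40.0, v_fov=22.0):
    """Map a point of interest to a normalised screen position.

    Returns (x, y) with each coordinate in [-1, 1] and (0, 0) at the
    display centre, or None when the point lies outside the FOV.
    """
    d_az = (poi_az - fov_az + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
    d_el = poi_el - fov_el
    if abs(d_az) > h_fov / 2 or abs(d_el) > v_fov / 2:
        return None                                    # outside FOV: prompt direction instead
    return (d_az / (h_fov / 2), d_el / (v_fov / 2))
```

A `None` result corresponds to the out-of-FOV case discussed below, where the device shows a directional hint rather than the AR information itself.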
In addition, AR information may be displayed differently according to whether the point of interest falls within the FOV. In some embodiments, if the point of interest falls within the FOV, AR information corresponding to it is displayed on the display interface of the AR head-mounted display device. In other embodiments, if the point of interest is outside the FOV, the display interface prompts the user with the direction of the point of interest, so that the user can find it by turning the view. As shown in fig. 6, in one embodiment, when the tower 200 does not fall within the FOV, a mark may be displayed on the side of the display device 10 nearer the tower to prompt the user to keep moving the field of view to the left until the tower becomes visible. In other embodiments, depending on the spatial position of the point of interest relative to the FOV, the user may be prompted in the display device 10 to raise the head, lower the head, move left, move right, and so on.
In one or more embodiments, the location-based AR navigation method further includes step S5: displaying the AR information in a differentiated manner based on the positional relationship between the point of interest and the user. In AR-glasses navigation, the AR information is superimposed directly on the real scene, so whether the AR information matches the real scene is critical to the user's understanding of the relationship between the two. Traditional AR navigation on a handheld smart device must capture the real scene with a camera and superimpose the AR information on the captured image; because the framed image differs in size from the real scene in the user's field of view, the user never truly experiences the blending of virtual and real. Furthermore, because of positioning errors and FOV-orientation measurement errors, the AR information may be attached to the wrong one of several adjacent points of interest. Displaying the AR information differentially according to the positional relationship between points of interest and the user lets the user clearly perceive how the display state of the AR information changes as positions change, and avoids the situation where closely spaced points of interest, combined with measurement error, leave the user unable to tell which AR information belongs to which point of interest.
In some embodiments, one or more of the color, size, transparency, and layer position of the AR information may be adjusted according to the distance between the point of interest and the user, so as to display the AR information in a differentiated manner. For example, AR information for points of interest farther from the user may be placed on lower display layers and/or scaled down with distance. In other embodiments, one or more of the color, size, transparency, and layer position of the AR information may be adjusted according to the distance between the AR information and the center of the display interface.
For example, if two or more points of interest fall within the FOV, the AR information may be displayed "near large, far small", so that the user can match each piece of AR information to its point of interest. Differentiation may also use color: AR information for nearer points of interest may be shown in darker colors, or AR information for farther points shown with higher transparency. The layer of the AR information nearest the user may be placed on top, so it is not occluded by other AR information. As another example, because the user's gaze settles most easily at the center of the display module, AR information nearer the center of the display interface can be enlarged and highlighted, giving the user a better AR experience.
Specifically, taking fig. 5 as an example, if the detected distance between the user and the tower is 1 km, the text 201 and the video 202 are displayed at the sizes shown in fig. 5. When the user moves to a distance of 500 m, the text 201 and the video 202 are enlarged by a corresponding ratio. Suppose another point of interest (not shown) exists in the scene of fig. 5 and the user's distance to it stays constant; the size of its AR information then also stays constant, while the tower 200 necessarily grows in the user's field of view as the distance shrinks and its AR information grows in synchronization. The user can thus feel the association between the AR information and the real object, improving the user experience.
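The 1 km → 500 m example implies a scale inversely proportional to distance. A sketch of such a differentiated-display rule; the function name, reference distance, and clamping bounds are illustrative choices, not values from the patent:

```python
def label_style(distance_m: float, ref_distance_m: float = 1000.0):
    """Return (scale, alpha) for an AR label at the given distance.

    Halving the distance doubles the label size (clamped to [0.25, 2.0]);
    opacity falls off with distance (clamped to [0.3, 1.0]) so distant
    labels fade rather than vanish.
    """
    ratio = ref_distance_m / distance_m
    scale = min(2.0, max(0.25, ratio))
    alpha = max(0.3, min(1.0, ratio))
    return scale, alpha
```

At the 1 km reference distance the label renders at its base size; at 500 m it doubles, matching the enlargement described above.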
In one or more embodiments, the position of the point of interest may change continuously over time. For example, in the embodiment that uses a star as the point of interest, the position of the star relative to the user depends on time, owing to the earth's rotation or the motion of the celestial body. Therefore, in some embodiments a time parameter may also be obtained, and since the star's trajectory corresponds to time, a point-of-interest map associated with that time parameter is obtained. As another example, in the embodiment that uses an unmanned aerial vehicle as the point of interest, the vehicle's position changes in real time during flight, so updates to its positioning and altitude information can be acquired from the vehicle over a communication link, and the display position of the AR information adjusted accordingly.
In some preferred embodiments, due to the error of the positioning system, the acquisition of the three-dimensional spatial information of the user may not be accurate, and after step S4, the current scene may be identified using an image recognition algorithm to determine whether the displayed position of the AR information matches the interest point. For example, in fig. 5, when the user observes the iron tower 200, the position of the iron tower obtained by image recognition may be compared with the position of the iron tower identified by positioning, so as to determine whether the current position of the AR information display is consistent with the position of the iron tower identified by the image. When the features of the interest points are relatively definite, image recognition can be used for correction.
In other preferred embodiments, where the user's spatial three-dimensional information may be inaccurate owing to positioning-system errors, error correction may be performed after step S4 according to the user's feedback. For example, for a map with multiple points of interest, the user can first be recommended a point of interest with distinctive features, and a confirmation prompt asks whether the current AR information matches it. If the user reports a successful match, the user's current spatial three-dimensional coordinates are confirmed accurate. If the user reports a deviation, the user can move the head to align the AR information with the real point of interest. Once this correction is complete, the same correction parameters can be applied when matching the other points of interest.
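Reusing the correction for other points of interest can be sketched as storing the angular offset measured on the well-known landmark and adding it to every subsequent projection. This is an illustrative interpretation; the class and the (azimuth, elevation) parameterisation are assumptions, not the patent's:

```python
class FovCorrection:
    """Store the offset dialled in on one landmark; reapply it to all POIs."""

    def __init__(self):
        self.d_az = 0.0
        self.d_el = 0.0

    def calibrate(self, displayed, observed):
        """displayed/observed: (azimuth, elevation) of the same landmark,
        as rendered vs. where the user's head movement placed it."""
        self.d_az = observed[0] - displayed[0]
        self.d_el = observed[1] - displayed[1]

    def apply(self, az, el):
        """Corrected direction for any other point of interest."""
        return az + self.d_az, el + self.d_el
```

One calibration against an unmistakable landmark thus corrects the systematic positioning/orientation bias for the whole point-of-interest map.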
Also disclosed in accordance with one or more embodiments of the present invention is an AR head-mounted display device comprising a processor and a memory, the memory storing a computer program that, when executed by the processor, causes the AR head-mounted display device to perform steps S1-S4 described above. One embodiment of the AR head-mounted display device is the AR glasses 100 shown in fig. 2; the specific steps it performs are not repeated here.
Although the present invention has been described with reference to a preferred embodiment, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A method of location-based AR navigation, comprising,
determining spatial three-dimensional coordinates of a user through a positioning system of the AR head-mounted display device;
acquiring an interest point map associated with the spatial three-dimensional coordinate of the user according to the spatial three-dimensional coordinate of the user, wherein the interest point map comprises at least one interest point and the spatial three-dimensional coordinate of the at least one interest point;
acquiring, by a FOV orientation detection system of the AR head-mounted display device, a FOV orientation of the AR head-mounted display device, wherein the FOV orientation is an orientation of a center of a FOV range corresponding to a display interface of the AR head-mounted display device;
displaying AR information about the point of interest on an AR head mounted display device according to the user's spatial three-dimensional coordinates, FOV orientation, and the spatial three-dimensional coordinates of the point of interest, such that the AR information and the point of interest correspond,
and based on the position relation between the interest point and the user, displaying the AR information in a differentiated mode.
2. The method of claim 1,
the positioning system comprises a satellite positioning device and a barometric sensor, the spatial three-dimensional coordinates are determined by positioning information and altitude information, wherein,
the satellite positioning device acquires the positioning information and/or the altitude information,
and the air pressure sensor acquires an air pressure value and calculates the altitude information through the air pressure value.
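As an illustrative sketch of the altitude computation in claim 2 (not part of the claims), the conversion from a barometric pressure reading to altitude is commonly done with the international barometric formula; the reference pressure and constants below are the standard-atmosphere values, assumed here for illustration:

```python
def altitude_from_pressure(p_hpa, p0_hpa=1013.25):
    """Altitude in metres from station pressure (hPa) and a sea-level
    reference pressure, using the international barometric formula."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))
```

In practice the reference pressure `p0_hpa` would be calibrated against local sea-level pressure, since weather variation dominates the error otherwise.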
3. The method of claim 1, wherein the FOV orientation comprises an orientation of the FOV in a horizontal plane and an angle of the FOV to the horizontal plane;
the FOV orientation detection system comprises a magnetic sensor and an IMU sensor,
the acquiring the FOV orientation further comprises:
determining, by the magnetic sensor, the orientation of the FOV in the horizontal plane by detecting the strength of the Earth's magnetic field,
and detecting the head movement of the user through the IMU sensor, and determining the included angle between the FOV and the horizontal plane.
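A minimal sketch of the two measurements in claim 3 (illustrative only; it assumes a level, calibrated magnetometer with no tilt compensation, and a gravity-aligned accelerometer reading from the IMU):

```python
import math

def heading_deg(mag_x, mag_y):
    """Horizontal-plane FOV orientation (compass heading) from a
    2-axis magnetometer reading of the Earth's magnetic field."""
    return math.degrees(math.atan2(mag_y, mag_x)) % 360.0

def pitch_deg(acc_x, acc_y, acc_z):
    """Angle of the FOV to the horizontal plane, from an accelerometer
    reading in which gravity dominates (device roughly static)."""
    return math.degrees(math.atan2(-acc_x, math.hypot(acc_y, acc_z)))
```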
4. The method of claim 1, wherein,
the displaying, on an AR head-mounted display device, AR information about a point of interest according to the spatial three-dimensional coordinate of the user, the FOV orientation, and the spatial three-dimensional coordinate of the point of interest, such that the AR information corresponds to the point of interest, further comprising:
determining a geometric positional relationship of the FOV and the point of interest based on the spatial three-dimensional coordinates of the user and the spatial three-dimensional coordinates of the point of interest,
and according to the geometric positional relationship, calculating the spatial angle between the FOV orientation and the line connecting the user and the point of interest, and determining the display area of the point of interest on the AR head-mounted display device.
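The geometry in claim 4 can be sketched as follows (an illustrative model, not the claimed implementation: local east-north-up coordinates in metres, a simple linear angular-to-pixel projection, and the display resolution and FOV widths are all assumptions):

```python
import math

def bearing_and_elevation(user, poi):
    """Azimuth and elevation of the line from the user to the POI,
    computed from their spatial three-dimensional coordinates."""
    dx, dy, dz = poi[0] - user[0], poi[1] - user[1], poi[2] - user[2]
    azimuth = math.degrees(math.atan2(dx, dy)) % 360.0  # 0 = north (+y)
    elevation = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return azimuth, elevation

def screen_position(azimuth, elevation, fov_az, fov_el,
                    width=1920, height=1080, h_fov=40.0, v_fov=25.0):
    """Map the angular offset between the FOV centre and the POI to a
    pixel position on the display interface."""
    daz = (azimuth - fov_az + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    dele = elevation - fov_el
    x = width / 2 + daz / h_fov * width
    y = height / 2 - dele / v_fov * height
    return x, y
```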
5. The method of claim, further comprising,
if the interest point falls into the FOV range, displaying AR information corresponding to the interest point on a display interface of the AR head-mounted display device;
and if the interest point falls out of the FOV range, prompting the position of the interest point of the user on a display interface of the AR head-mounted display device.
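The in-view/out-of-view decision of claim 5 can be sketched as a check of the angular offsets against half the FOV, with an edge direction returned for the out-of-view prompt (FOV widths and the direction-labelling policy are illustrative assumptions):

```python
def classify_poi(daz, dele, h_fov=40.0, v_fov=25.0):
    """Given the POI's horizontal and vertical angular offsets from
    the FOV centre (degrees), report 'in_view' or which screen edge
    should carry a prompt pointing toward the POI."""
    if abs(daz) <= h_fov / 2 and abs(dele) <= v_fov / 2:
        return "in_view"
    horiz = "right" if daz > 0 else "left"
    vert = "up" if dele > 0 else "down"
    return horiz if abs(daz) >= abs(dele) else vert
```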
6. The method of any one of claims 1-5, wherein the displaying the AR information in a differentiated mode based on the positional relationship between the point of interest and the user further comprises:
and adjusting one or more of the color, the size, the transparency and the layer position of the AR information according to the distance between the interest point and the user.
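One plausible policy for the distance-based adjustment in claim 6 is a linear interpolation between near and far styles (the thresholds, size range, and transparency range below are illustrative assumptions, not mandated by the claims):

```python
def style_for_distance(distance_m, near=100.0, far=5000.0):
    """Interpolate AR label font size (px) and opacity from the POI's
    distance: nearer POIs are drawn larger and more opaque."""
    t = min(max((distance_m - near) / (far - near), 0.0), 1.0)
    font_px = round(48 - 24 * t)       # 48 px at/inside `near`, 24 px at `far`
    alpha = round(1.0 - 0.6 * t, 2)    # opacity 1.0 near, 0.4 far
    return font_px, alpha
```

The same interpolation shape applies to the claim-7 variant, with the POI's distance replaced by the AR label's distance from the centre of the display interface.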
7. The method of any one of claims 1-5, wherein the displaying the AR information in a differentiated mode based on the positional relationship between the point of interest and the user further comprises:
and adjusting one or more of the color, the size, the transparency and the layer position of the AR information according to the distance between the AR information and the center of the display interface.
8. The method of claim 1, wherein obtaining a point-of-interest map associated with the spatial three-dimensional coordinates of the user according to the spatial three-dimensional coordinates of the user, further comprises:
the time parameter is obtained and used for the time,
and if the interest point map is related to the time parameter, further acquiring the interest point map related to the time parameter.
9. The method of claim 1, further comprising,
and receiving the update information of the spatial three-dimensional coordinate information of the interest point, and adjusting the display position of the AR information according to the update information.
10. An AR head mounted display device comprising a processor and a memory, the memory storing a computer program that, when executed by the processor, causes the AR head mounted display device to perform the method of any of claims 1-9.
CN202011458842.3A 2020-12-11 2020-12-11 AR navigation method based on positioning and AR head-mounted display device Active CN112525185B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011458842.3A CN112525185B (en) 2020-12-11 2020-12-11 AR navigation method based on positioning and AR head-mounted display device


Publications (2)

Publication Number Publication Date
CN112525185A true CN112525185A (en) 2021-03-19
CN112525185B CN112525185B (en) 2022-10-28

Family

ID=74999114

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011458842.3A Active CN112525185B (en) 2020-12-11 2020-12-11 AR navigation method based on positioning and AR head-mounted display device

Country Status (1)

Country Link
CN (1) CN112525185B (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130278631A1 (en) * 2010-02-28 2013-10-24 Osterhout Group, Inc. 3d positioning of augmented reality information
US20140368533A1 (en) * 2013-06-18 2014-12-18 Tom G. Salter Multi-space connected virtual data objects
US20160055671A1 (en) * 2014-08-22 2016-02-25 Applied Research Associates, Inc. Techniques for Enhanced Accurate Pose Estimation
CN108830944A (en) * 2018-07-12 2018-11-16 北京理工大学 Optical perspective formula three-dimensional near-eye display system and display methods
CN110160529A (en) * 2019-06-17 2019-08-23 河南田野文化艺术有限公司 A kind of guide system of AR augmented reality
WO2020062267A1 (en) * 2018-09-30 2020-04-02 华为技术有限公司 Information prompt method and electronic device
US20200158873A1 (en) * 2017-06-12 2020-05-21 Vc Inc. Distance measuring apparatus and control method therefor


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024066752A1 (en) * 2022-09-29 2024-04-04 歌尔股份有限公司 Display control method and apparatus, head-mounted display device, and medium
CN116033348A (en) * 2023-03-30 2023-04-28 中国兵器科学研究院 Method and device for generating electronic fence
CN116033348B (en) * 2023-03-30 2023-06-30 中国兵器科学研究院 Method and device for generating electronic fence

Also Published As

Publication number Publication date
CN112525185B (en) 2022-10-28

Similar Documents

Publication Publication Date Title
US11423586B2 (en) Augmented reality vision system for tracking and geolocating objects of interest
US11796309B2 (en) Information processing apparatus, information processing method, and recording medium
US10215989B2 (en) System, method and computer program product for real-time alignment of an augmented reality device
JP5443134B2 (en) Method and apparatus for marking the position of a real-world object on a see-through display
US9158305B2 (en) Remote control system
CN108022302B (en) Stereo display device of Inside-Out space orientation's AR
EP3008708B1 (en) Vision augmented navigation
US10269139B2 (en) Computer program, head-mounted display device, and calibration method
US20160117864A1 (en) Recalibration of a flexible mixed reality device
WO2019037489A1 (en) Map display method, apparatus, storage medium and terminal
CN107014378A (en) A kind of eye tracking aims at control system and method
WO2017219195A1 (en) Augmented reality displaying method and head-mounted display device
CN112525185B (en) AR navigation method based on positioning and AR head-mounted display device
US20130265331A1 (en) Virtual Reality Telescopic Observation System of Intelligent Electronic Device and Method Thereof
CN106370160A (en) Robot indoor positioning system and method
EP3642694B1 (en) Augmented reality system and method of displaying an augmented reality image
US11598636B2 (en) Location information display device and surveying system
JP2023075236A (en) Locus display device
US10559132B2 (en) Display apparatus, display system, and control method for display apparatus
CN113923437B (en) Information display method, processing device and display system thereof
CN108627157A (en) A kind of head based on three-dimensional marking plate shows localization method, device and three-dimensional marking plate
CN110769245A (en) Calibration method and related equipment
CN112558008B (en) Navigation method, system, equipment and medium based on optical communication device
JP2012059079A (en) Additional information display system, additional information display control method, and additional information display control program
JP2023062983A (en) Virtual iron tower display system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant