CN112729327A - Navigation method, navigation device, computer equipment and storage medium


Info

Publication number
CN112729327A
Authority
CN
China
Prior art keywords
equipment
dimensional
determining
pose information
inflection point
Prior art date
Legal status
Granted
Application number
CN202011549889.0A
Other languages
Chinese (zh)
Other versions
CN112729327B
Inventor
Zhang Jianbo (张建博)
Li Yufei (李宇飞)
Current Assignee
Zhejiang Shangtang Technology Development Co Ltd
Original Assignee
Zhejiang Shangtang Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Shangtang Technology Development Co Ltd
Priority to CN202011549889.0A
Publication of CN112729327A
Application granted
Publication of CN112729327B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a navigation method, apparatus, computer device and storage medium. The method includes: acquiring a live-action image shot by an AR device, and determining three-dimensional pose information of the AR device; acquiring a navigation path for the AR device based on the three-dimensional pose information of the AR device and the destination information of the AR device; determining a display posture of a three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the navigation path; and displaying, on the AR device, an AR navigation map containing the three-dimensional virtual arrow and the live-action image according to the display posture of the three-dimensional virtual arrow.

Description

Navigation method, navigation device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of Augmented Reality (AR) technologies, and in particular, to a navigation method, an apparatus, a computer device, and a storage medium.
Background
With the development of science and technology, people increasingly rely on navigation systems when traveling. When navigating for a user, a navigation path to the destination is generally determined from the destination input by the user, and navigation information is presented to the user based on that path.
However, the navigation information is generally displayed as a two-dimensional indication arrow, which can prevent the user from accurately judging the direction of travel. For example, at a fork that offers both an uphill ramp and a level straight road, the direction pointed to by a two-dimensional arrow indicating the uphill ramp differs little from the direction it points to when indicating going straight ahead, so the user cannot accurately judge which way to go, and the navigation effect is poor.
Disclosure of Invention
The embodiment of the disclosure at least provides a navigation method, a navigation device, computer equipment and a storage medium.
In a first aspect, an embodiment of the present disclosure provides a navigation method, including:
acquiring a live-action image shot by AR equipment, and determining three-dimensional pose information of the AR equipment;
acquiring a navigation path of the AR equipment based on the three-dimensional pose information of the AR equipment and the destination information of the AR equipment;
determining a display posture of a three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the navigation path;
and displaying an AR navigation map containing the three-dimensional virtual arrow and the live-action image on the AR equipment according to the display posture of the three-dimensional virtual arrow.
With this method, when navigating for the user, an AR navigation map containing the three-dimensional virtual arrow and the live-action image is displayed to the user through the AR device, enriching the display form of navigation. Because the three-dimensional virtual arrow can point in any of multiple directions, its indication is more definite, which solves the problem of the user being unable to accurately judge the direction of travel.
In a possible implementation, the acquiring the live-action image taken by the AR device includes:
and after the AR equipment is determined to start the navigation application, calling an image acquisition device of the AR equipment to shoot the live-action image in real time.
In a possible embodiment, the determining the presentation posture of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the navigation path includes:
determining a target inflection point in the navigation path closest to the AR device;
and determining the display posture of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR equipment and the position information of the target inflection point.
Here, because the display posture is determined from the three-dimensional pose information of the AR device and the position information of the target inflection point, the three-dimensional virtual arrow can accurately guide the user to the target inflection point, improving the navigation effect.
In one possible implementation, the three-dimensional pose information of the AR device includes position information and orientation information of the AR device; the three-dimensional virtual arrow has six degrees of freedom;
the determining the display posture of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the position information of the target inflection point comprises:
determining a state variation corresponding to the three-dimensional virtual arrow in each degree of freedom based on the three-dimensional pose information of the AR device and the position information of the target inflection point;
and determining the display posture of the three-dimensional virtual arrow based on the state variation of the three-dimensional virtual arrow corresponding to each degree of freedom.
A three-dimensional virtual arrow with six degrees of freedom can point in any direction, so when it guides the user to the target inflection point its indication is more definite.
In one possible implementation manner, each inflection point in the navigation path corresponds to a turning region; the turning region of the current inflection point is the portion of the straight segment from the previous inflection point to the current inflection point that lies within a preset distance of the current inflection point;
when the AR device is located in the turning region of the target inflection point, the display posture of the three-dimensional virtual arrow is determined according to the following method:
determining a next inflection point corresponding to the target inflection point and position information of the next inflection point based on the position information of the target inflection point;
and determining the display posture of the three-dimensional virtual arrow based on the pose information of the AR equipment and the position information of the next inflection point.
When the AR device is located in the turning region of the target inflection point, it is already close to the target inflection point. In this case, determining the display posture of the three-dimensional virtual arrow based on the position information of the next inflection point prompts the user in advance with the turning information corresponding to the target inflection point, so the user can prepare to turn at the target inflection point.
In one possible implementation, the three-dimensional pose information of the AR device includes position information and orientation information of the AR device;
determining three-dimensional pose information of the AR device according to:
and determining the three-dimensional pose information of the AR equipment based on the live-action image shot by the AR equipment and a pre-constructed three-dimensional scene map.
In one possible implementation, the three-dimensional pose information of the AR device is determined according to the following method:
and acquiring data acquired by an inertial measurement unit in the AR equipment, and determining the three-dimensional pose information of the AR equipment based on the data acquired by the inertial measurement unit.
Because the three-dimensional pose of the AR device changes in real time, determining the three-dimensional pose information based on the data collected by the inertial measurement unit allows that information to be updated in real time, so that the display posture of the three-dimensional virtual arrow can likewise be adjusted in real time.
In a possible embodiment, the method further comprises:
and calibrating the three-dimensional pose information of the AR equipment based on the live-action image shot by the AR equipment every other preset time.
Because the measurement error of the inertial measurement unit is relatively large, calibrating the three-dimensional pose of the AR device based on the live-action image shot by the AR device avoids the error introduced by the measurement accuracy of the inertial measurement unit and further improves the accuracy of the display posture of the three-dimensional virtual arrow.
In a second aspect, an embodiment of the present disclosure further provides a navigation device, including:
the first determining module is used for acquiring a live-action image shot by the AR equipment and determining three-dimensional pose information of the AR equipment;
the acquisition module is used for acquiring a navigation path of the AR equipment based on the three-dimensional pose information of the AR equipment and the destination information of the AR equipment;
the second determination module is used for determining the display posture of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR equipment and the navigation path;
and the display module is used for displaying the AR navigation chart containing the three-dimensional virtual arrow and the live-action image on the AR equipment according to the display posture of the three-dimensional virtual arrow.
In one possible implementation, the first determining module, when acquiring the live-action image taken by the AR device, is configured to:
and after the AR equipment is determined to start the navigation application, calling an image acquisition device of the AR equipment to shoot the live-action image in real time.
In one possible embodiment, the second determining module, when determining the presentation posture of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the navigation path, is configured to:
determining a target inflection point in the navigation path closest to the AR device;
and determining the display posture of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR equipment and the position information of the target inflection point.
In one possible implementation, the three-dimensional pose information of the AR device includes position information and orientation information of the AR device; the three-dimensional virtual arrow has six degrees of freedom;
the second determination module, when determining the display posture of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the position information of the target inflection point, is configured to:
determining a state variation corresponding to the three-dimensional virtual arrow in each degree of freedom based on the three-dimensional pose information of the AR device and the position information of the target inflection point;
and determining the display posture of the three-dimensional virtual arrow based on the state variation of the three-dimensional virtual arrow corresponding to each degree of freedom.
In one possible implementation manner, each inflection point in the navigation path corresponds to a turning region; the turning region of the current inflection point is the portion of the straight segment from the previous inflection point to the current inflection point that lies within a preset distance of the current inflection point;
when the AR device is located in the turning region of the target inflection point, the second determining module is configured to determine the display posture of the three-dimensional virtual arrow according to the following method:
determining a next inflection point corresponding to the target inflection point and position information of the next inflection point based on the position information of the target inflection point;
and determining the display posture of the three-dimensional virtual arrow based on the pose information of the AR equipment and the position information of the next inflection point.
In one possible implementation, the three-dimensional pose information of the AR device includes position information and orientation information of the AR device;
the second determination module is configured to determine three-dimensional pose information of the AR device according to the following method:
and determining the three-dimensional pose information of the AR equipment based on the live-action image shot by the AR equipment and a pre-constructed three-dimensional scene map.
In one possible implementation, the second determining module is configured to determine three-dimensional pose information of the AR device according to the following method:
acquiring data acquired by an inertial measurement unit in the AR equipment, and determining three-dimensional pose information of the AR equipment based on the data acquired by the inertial measurement unit;
the apparatus further comprises a calibration module to:
and calibrating the three-dimensional pose information of the AR equipment based on the live-action image shot by the AR equipment every other preset time.
In a third aspect, an embodiment of the present disclosure further provides a computer device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when the computer device is running, the machine-readable instructions when executed by the processor performing the steps of the first aspect described above, or any possible implementation of the first aspect.
In a fourth aspect, an embodiment of the present disclosure further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, performs the steps in the first aspect or any one of the possible implementation manners of the first aspect.
For the description of the effects of the navigation device, the computer device, and the computer-readable storage medium, reference is made to the description of the navigation method, which is not repeated herein.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required for use in the embodiments are briefly described below. The drawings, which are incorporated in and form a part of the specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the technical solutions of the present disclosure. It should be appreciated that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those skilled in the art will be able to derive further related drawings from them without inventive effort.
FIG. 1 illustrates a flow chart of a navigation method provided by an embodiment of the present disclosure;
fig. 2 shows a schematic diagram of a three-dimensional virtual arrow displayed in an AR device provided by an embodiment of the present disclosure;
FIG. 3 illustrates a schematic diagram of a turn region of a navigation path provided by an embodiment of the present disclosure;
fig. 4 is a schematic diagram illustrating an architecture of a navigation device provided by an embodiment of the present disclosure;
fig. 5 shows a schematic structural diagram of a computer device 500 provided by an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure clearer, the technical solutions of the embodiments are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present disclosure, not all of them. The components of the embodiments of the present disclosure, as generally described and illustrated in the figures, could be arranged and designed in a wide variety of different configurations. Therefore, the following detailed description of the embodiments of the present disclosure is not intended to limit the scope of the claimed disclosure but merely represents selected embodiments of the disclosure. All other embodiments obtained by a person skilled in the art based on the embodiments of the present disclosure without creative effort shall fall within the protection scope of the present disclosure.
In the related art, when navigating for a user, guidance is generally shown through a two-dimensional indication arrow, which can prevent the user from accurately judging the direction of travel. For example, at a fork that offers both an uphill ramp and a level straight road, the direction pointed to by a two-dimensional arrow indicating the uphill ramp differs little from the direction it points to when indicating going straight ahead, so the user cannot accurately judge which way to go, and the navigation effect is poor.
Based on this research, the present disclosure provides a navigation method, a navigation device, a computer device, and a storage medium. When navigating for a user, an AR navigation map containing a three-dimensional virtual arrow and a live-action image is displayed to the user through an AR device, enriching the display form of navigation. Because the three-dimensional virtual arrow can point in any of multiple directions, its indication is more definite, which solves the problem of the user being unable to accurately judge the direction of travel.
The drawbacks described above were identified by the inventors through practical and careful study; therefore, the discovery of these problems and the solutions the present disclosure proposes for them should both be regarded as the inventors' contribution to the present disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
To facilitate understanding of the present embodiment, a navigation method disclosed in the embodiments of the present disclosure is first described in detail. The execution subject of the navigation method provided in the embodiments of the present disclosure is generally a computer device with certain computing capability, for example a terminal device, a server, or other processing equipment. The terminal device may be an AR device, which may include AR glasses, a tablet computer, a smartphone, a smart wearable device, or another device with obvious display and data-processing functions, and the AR device may connect to a cloud server through an application.
Referring to fig. 1, a flowchart of a navigation method provided in an embodiment of the present disclosure is shown, where the method includes steps 101 to 104, where:
step 101, acquiring a live-action image shot by AR equipment, and determining three-dimensional pose information of the AR equipment.
And 102, acquiring a navigation path of the AR equipment based on the three-dimensional pose information of the AR equipment and the destination information of the AR equipment.
And 103, determining the display posture of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR equipment and the navigation path.
And 104, displaying an AR navigation chart containing the three-dimensional virtual arrow and the live-action image on the AR equipment according to the display posture of the three-dimensional virtual arrow.
The following is a detailed description of the above steps 101 to 104.
For step 101,
Acquiring the live-action image shot by the AR device may mean acquiring a live-action image of the target scene where the AR device is located, shot by the AR device in real time. The live-action image can be obtained by calling an image acquisition device of the AR device to shoot in real time after it is detected that the AR device has started the navigation application.
In a possible implementation manner, the three-dimensional pose information of the AR device may be determined by the AR device itself based on the live-action image it captures; alternatively, the AR device may transmit the live-action image to a cloud server, the cloud server determines the three-dimensional pose information based on the image, and the AR device then acquires the determined three-dimensional pose information from the cloud server.
The method for determining the three-dimensional pose information by the AR device may be the same as the method for determining the three-dimensional pose information by the cloud server, and the method for determining the three-dimensional pose information will be described below by taking the AR device as an execution subject for determining the three-dimensional pose information as an example.
The three-dimensional pose information of the AR device may be pose information of the AR device in a three-dimensional coordinate system of the target scene, and may include position information and orientation information.
In a possible implementation manner, when determining the three-dimensional pose information of the AR device, a plurality of target detection points in the target scene corresponding to the live-action image may be detected, and the target pixel point corresponding to each target detection point in the live-action image determined; the depth information corresponding to each target pixel point (for example, obtained by performing depth detection on the live-action image) is then determined, and the three-dimensional pose information of the AR device is determined based on the depth information of the target pixel points.
The target detection point may be a preset position point in a scene where the AR device is located, for example, a cup, a fan, a water dispenser, and the like, and the depth information of the target pixel point may be used to represent a distance between the target detection point corresponding to the target pixel point and an image acquisition device of the AR device. The position coordinates of the target detection points in the scene coordinate system are preset and fixed.
Specifically, when determining the three-dimensional pose information of the AR device, the orientation of the AR device can be determined from the coordinate information of the target pixel points corresponding to the target detection points in the scene image, and the position information of the AR device can be determined from the depth values of those target pixel points; the three-dimensional pose information of the AR device is thereby determined.
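The patent does not fix a concrete algorithm for this step. As one common realization, the pose can be recovered with a perspective-n-point (PnP) solve over the preset detection points; the following is a minimal sketch of that idea, in which the OpenCV usage is real but all point coordinates, intrinsics, and the synthetic test pose are hypothetical.

```python
import numpy as np
import cv2

def estimate_device_pose(object_points, image_points, camera_matrix):
    """Recover the device pose from preset target detection points whose
    scene coordinates are fixed in advance (see the text above)."""
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, None)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)          # rotation mapping scene -> camera
    position = (-R.T @ tvec).ravel()    # device position in scene coordinates
    return position, R.T                # R.T maps camera axes back to the scene

# Illustrative data: six hypothetical detection points and a synthetic view.
K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
pts_3d = np.array([[0, 0, 0], [2, 0, 0], [0, 2, 0],
                   [2, 2, 0], [0, 0, 2], [2, 0, 2]], dtype=np.float32)
rvec_gt = np.array([0.1, 0.2, 0.0])
tvec_gt = np.array([-1.0, -1.0, 8.0])
pts_2d, _ = cv2.projectPoints(pts_3d, rvec_gt, tvec_gt, K, None)
pos, R_cw = estimate_device_pose(pts_3d, pts_2d.reshape(-1, 2), K)
```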
In another possible implementation manner, when the three-dimensional pose information of the AR device is determined, a live-action image shot by the AR device may be matched with a three-dimensional model corresponding to a target scene where the AR device is located, and the three-dimensional pose information of the AR device may be determined based on a matching result.
Specifically, the three-dimensional model corresponding to the target scene may be constructed based on a panoramic video; from this panoramic live-action footage, the region images corresponding to each position region of the three-dimensional model, together with the pose information at which each region image was shot, can be obtained. When the live-action image shot by the AR device is matched with the three-dimensional model, it can be matched against the region images corresponding to the position regions of the model; the successfully matched target region image and the relative positional relationship between the shooting positions of the target region image and the live-action image are determined, and the three-dimensional pose information of the AR device when shooting the live-action image is then determined based on that relative relationship and the pose at which the target region image was shot.
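As a hedged sketch of this matching step, the retrieval part can be approximated with ordinary feature matching; here ORB features rank the pre-captured region images (an assumption, the patent names no feature type), and the stored capture pose of the best match is returned, with the relative-pose refinement the text describes left as a comment.

```python
import cv2

def relocalize(query_bgr, region_images):
    """region_images: list of (image_bgr, stored_pose) pairs built from the
    panoramic video used to construct the three-dimensional scene model."""
    orb = cv2.ORB_create(nfeatures=1000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    query_gray = cv2.cvtColor(query_bgr, cv2.COLOR_BGR2GRAY)
    _, q_desc = orb.detectAndCompute(query_gray, None)
    best_pose, best_score = None, 0
    for img, pose in region_images:
        _, desc = orb.detectAndCompute(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY), None)
        if q_desc is None or desc is None:
            continue
        score = len(matcher.match(q_desc, desc))   # count of mutual matches
        if score > best_score:
            best_pose, best_score = pose, score
    # A full implementation would further estimate the relative pose between
    # the query image and the matched region image, as the text describes.
    return best_pose
```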
In another possible implementation, the three-dimensional pose information of the AR device may be determined by means of an inertial measurement unit IMU inside the AR device. Specifically, data acquired by the inertial measurement unit IMU may be acquired, and then the three-dimensional pose information of the AR device may be determined based on the data acquired by the inertial measurement unit IMU.
Because the three-dimensional pose of the AR equipment is changed in real time, the three-dimensional pose information of the AR equipment is determined based on the data collected by the inertial measurement unit, the three-dimensional pose information of the AR equipment can be updated in real time, and the display posture of the three-dimensional virtual arrow can be further adjusted in real time.
However, in practical applications, the data acquired by an Inertial Measurement Unit (IMU) may contain errors. Therefore, in order to avoid the influence of the IMU's measurement error on the three-dimensional pose information of the AR device, the three-dimensional pose information may be calibrated every preset time interval based on the live-action image captured by the AR device.
Specifically, every preset time interval, the three-dimensional pose information of the AR device may be determined based on a live-action image captured by the AR device and compared with the three-dimensional pose information determined from the data acquired by the IMU; if the two are inconsistent, the IMU-based result is updated with the image-based result, that is, the three-dimensional pose information determined from the live-action image is taken as the current three-dimensional pose information of the AR device.
In another possible implementation manner, the three-dimensional pose information of the AR device may initially be determined based on a live-action image captured by the AR device; then, while the AR navigation map is being displayed, the pose is tracked in real time with the IMU built into the AR device, re-determined every preset time interval from a newly captured live-action image, and the IMU-tracked pose is updated with the re-determined pose.
Here, because the measurement error of the inertial measurement unit is relatively large, calibrating the three-dimensional pose of the AR device based on the live-action image shot by the AR device avoids the error introduced by the IMU's measurement accuracy and further improves the accuracy of the display posture of the three-dimensional virtual arrow.
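A minimal sketch of this tracking-plus-calibration loop follows; the pose representation (planar position plus yaw), the five-second interval, and the visual_localize callback are all assumptions introduced for illustration.

```python
import time
import numpy as np

class PoseTracker:
    """IMU dead reckoning with periodic visual calibration, as described above."""

    def __init__(self, position, yaw_deg, calib_interval_s=5.0):
        self.position = np.asarray(position, dtype=float)
        self.yaw_deg = float(yaw_deg)
        self.calib_interval_s = calib_interval_s
        self._last_calib = time.monotonic()

    def update_from_imu(self, delta_position, delta_yaw_deg):
        # Integrating IMU increments keeps the pose current but accumulates drift.
        self.position += np.asarray(delta_position, dtype=float)
        self.yaw_deg = (self.yaw_deg + delta_yaw_deg) % 360.0

    def maybe_calibrate(self, frame, visual_localize):
        # Every preset interval, re-determine the pose from the live-action
        # image; where the two disagree, the image-based result wins.
        now = time.monotonic()
        if now - self._last_calib < self.calib_interval_s:
            return
        result = visual_localize(frame)
        if result is not None:
            self.position, self.yaw_deg = result
        self._last_calib = now
```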
With respect to step 102,
In one possible implementation, the destination information of the AR device is input by the user on the AR device. When acquiring the navigation path, the AR device sends its three-dimensional pose information and the destination information to the server; the server determines a navigation path to the target location corresponding to the destination information and sends it back to the AR device. The starting point of the navigation path may be the current location of the AR device.
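The patent leaves this client/server exchange abstract; the sketch below shows one plausible shape for it, in which the endpoint URL, the JSON field names, and the waypoint response format are entirely hypothetical.

```python
import json
import urllib.request

def request_navigation_path(pose, destination,
                            server="https://nav.example.com/path"):
    """Send the device pose and destination; receive an ordered waypoint list
    (which includes the inflection points used in step 103)."""
    payload = json.dumps({
        "position": pose["position"],        # path starts at this location
        "orientation": pose["orientation"],
        "destination": destination,
    }).encode("utf-8")
    req = urllib.request.Request(server, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["waypoints"]
```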
For step 103,
In a possible implementation manner, when determining the display posture of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the navigation path, the target inflection point closest to the AR device in the navigation path may first be determined; the display posture of the three-dimensional virtual arrow is then determined based on the three-dimensional pose information of the AR device and the position information of the target inflection point.
Specifically, the navigation path includes inflection points and the segments between them. An inflection point may be a position point where the turning angle exceeds a preset angle; for example, a position point where the turning angle exceeds 30° may be determined to be an inflection point.
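As an illustration of this definition, the following sketch walks a 2-D polyline path and keeps every vertex whose turning angle exceeds the preset angle; the 30-degree threshold comes from the example above, while the planar simplification and the sample coordinates are assumptions.

```python
import math

def inflection_points(waypoints, threshold_deg=30.0):
    """Return the vertices of a polyline path whose turning angle exceeds
    the preset angle (30 degrees in the example above)."""
    result = []
    for i in range(1, len(waypoints) - 1):
        ax, ay = waypoints[i - 1]
        bx, by = waypoints[i]
        cx, cy = waypoints[i + 1]
        h1 = math.atan2(by - ay, bx - ax)   # heading into the vertex
        h2 = math.atan2(cy - by, cx - bx)   # heading out of the vertex
        turn = abs(math.degrees((h2 - h1 + math.pi) % (2 * math.pi) - math.pi))
        if turn > threshold_deg:
            result.append(waypoints[i])
    return result

path = [(0, 0), (10, 0), (20, 0.5), (30, 10)]   # illustrative waypoints
print(inflection_points(path))                   # only the sharp turn survives
```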
It should be noted that the three-dimensional pose information of the AR device includes position information and orientation information of the AR device, and the three-dimensional virtual arrow has six degrees of freedom; for example, if the AR device is a mobile phone, the three-dimensional virtual arrow displayed in the AR device may be as shown in fig. 2.
When the display posture of the three-dimensional virtual arrow is determined based on the three-dimensional pose information of the AR device and the position information of the target inflection point, the state variation corresponding to each degree of freedom of the three-dimensional virtual arrow may be determined based on the three-dimensional pose information of the AR device and the position information of the target inflection point, and then the display posture of the three-dimensional virtual arrow may be determined based on the state variation of the three-dimensional virtual arrow in each degree of freedom.
The state variations of the three-dimensional virtual arrow in its degrees of freedom represent, respectively, the amounts of movement along the x, y, and z axes and the amounts of rotation around the x, y, and z axes; based on these state variations, the display posture in which the three-dimensional virtual arrow points to the target inflection point can be determined.
Here, when the three-dimensional virtual arrow is displayed in the determined display posture, it always points to the position corresponding to the target inflection point; the position information of the target inflection point may be three-dimensional coordinate information whose z-axis coordinate matches the height of the AR device and varies as the height of the AR device changes.
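A hedged sketch of this computation follows: it rotates the arrow so that it points from the device toward the target inflection point. Reducing the six-degree-of-freedom result to yaw/pitch/roll plus a translation held at a preset anchor is an assumption made for brevity; pitch is zero because the target's z coordinate tracks the device height, as described above.

```python
import math

def arrow_display_posture(device_pos, device_yaw_deg, inflection_xy):
    """device_pos: (x, y, z) in scene coordinates. The inflection point's z
    is taken equal to the device height, so pitch is zero here."""
    dx = inflection_xy[0] - device_pos[0]
    dy = inflection_xy[1] - device_pos[1]
    world_yaw = math.degrees(math.atan2(dy, dx))
    return {
        "yaw": (world_yaw - device_yaw_deg) % 360.0,  # rotation about z
        "pitch": 0.0,                                 # target z == device z
        "roll": 0.0,
        "translation": (0.0, 0.0, 0.0),               # kept at a preset anchor
    }
```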
In practical applications, when the AR device is close to the target inflection point, the display posture of the three-dimensional virtual arrow needs to be adjusted in advance to prompt the user about the turn at the target inflection point.
Specifically, each inflection point in the navigation path corresponds to a turning region; the turning region corresponding to the current inflection point is the portion of the straight segment from the previous inflection point to the current inflection point that lies within a preset distance of the current inflection point.
For example, if the navigation path is as shown in fig. 3, it includes inflection point 1, inflection point 2, inflection point 3, and inflection point 4, and the turning region corresponding to each inflection point is the gray region nearest that inflection point.
In practical applications, the turning regions of all inflection points may be the same size, or each may be sized according to the turning angle at the inflection point. For example, if the turning angle corresponding to the target inflection point is smaller than a first preset angle, the turn is gentle and the user does not need to be prompted very early, so the turning region may have a first preset size. If the turning angle is greater than or equal to the first preset angle but smaller than a second preset angle, the turning region may have a second preset size; and if the turning angle is greater than or equal to the second preset angle, the turning region may have a third preset size. The first preset size is smaller than the second and third preset sizes, and the second preset size is smaller than the third.
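The following sketch captures this angle-dependent sizing rule; the concrete angles and distances stand in for the first/second preset angles and the three preset sizes, none of which the patent fixes.

```python
def turning_region_length(turn_angle_deg,
                          first_preset_angle=45.0, second_preset_angle=90.0,
                          first_size=3.0, second_size=6.0, third_size=10.0):
    """Return how far before the inflection point (meters, assumed) the
    turning region begins along the incoming straight segment."""
    if turn_angle_deg < first_preset_angle:
        return first_size            # gentle turn: no need to prompt early
    if turn_angle_deg < second_preset_angle:
        return second_size
    return third_size                # sharp turn: prompt earliest
```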
In one possible implementation, when it is detected that the AR device is located in the turning region of the target inflection point, the display posture of the three-dimensional virtual arrow may be determined as follows: based on the position information of the target inflection point, the next inflection point and its position information are determined; the display posture of the three-dimensional virtual arrow is then determined based on the pose information of the AR device and the position information of the next inflection point.
Here, the next inflection point corresponding to the target inflection point may be the inflection point nearest the target inflection point within the portion of the navigation path the AR device has not yet reached.
When the display posture of the three-dimensional virtual arrow is determined based on the pose information of the AR device and the position information of the next inflection point, the state variation of the three-dimensional virtual arrow in each degree of freedom may first be determined from the three-dimensional pose information of the AR device and the position information of the next inflection point, and the display posture then determined from those state variations.
Here, when the AR device is located in the turning region of the target inflection point, it is already close to the target inflection point. In this case, determining the display posture of the three-dimensional virtual arrow based on the position information of the next inflection point prompts the user in advance with the turning information corresponding to the target inflection point, so the user can prepare to turn at the target inflection point.
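Putting the last two paragraphs together, the arrow's target can be selected as sketched below: it normally points at the nearest unreached inflection point and switches to the next one once the device enters the current target's turning region. The planar distance test and the region lengths from the previous sketch are assumptions.

```python
import math

def current_arrow_target(device_xy, remaining_inflections, region_lengths):
    """remaining_inflections: inflection points not yet reached, in path order;
    region_lengths: the matching turning-region lengths (see previous sketch)."""
    if not remaining_inflections:
        return None                       # no turns left on the path
    target = remaining_inflections[0]
    dist = math.hypot(device_xy[0] - target[0], device_xy[1] - target[1])
    if dist <= region_lengths[0] and len(remaining_inflections) > 1:
        return remaining_inflections[1]   # inside the turning region: prompt early
    return target
```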
With respect to step 104,
In a possible implementation manner, when the AR navigation map is displayed according to the display posture of the three-dimensional virtual arrow, the three-dimensional virtual arrow may be fused, in the corresponding display posture, with the live-action images captured by the AR device; the AR navigation map then consists of a plurality of continuous live-action images into which the three-dimensional virtual arrow has been fused.
When the three-dimensional virtual arrow is fused with the live-action image shot by the AR device, it can be added at a preset position of the image in the determined display posture.
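A minimal sketch of this fusion step follows, drawing the arrow into a camera frame at a preset anchor position oriented by the computed yaw; rendering a flat 2-D arrow instead of a full 3-D model, and the anchor and color values, are simplifying assumptions.

```python
import numpy as np
import cv2

def fuse_arrow(frame_bgr, yaw_deg, anchor=(0.5, 0.8), length_px=80):
    """Draw the virtual arrow into one live-action frame at a preset position,
    oriented by the display posture's yaw (0 degrees points straight ahead)."""
    h, w = frame_bgr.shape[:2]
    cx, cy = int(anchor[0] * w), int(anchor[1] * h)    # preset image position
    theta = np.deg2rad(yaw_deg)
    tip = (int(cx + length_px * np.sin(theta)),        # rotate about the anchor
           int(cy - length_px * np.cos(theta)))
    cv2.arrowedLine(frame_bgr, (cx, cy), tip, (0, 200, 255), 6, tipLength=0.35)
    return frame_bgr
```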
The AR navigation map containing the three-dimensional virtual arrow and the live-action image may be displayed on the AR device either after the AR device acquires it from the server, or after the server determines the AR navigation map, sends it to the AR device, and controls the AR device to display it.
Based on the method, when the navigation is performed for the user, the AR navigation chart containing the three-dimensional virtual arrow and the live-action image is displayed for the user through the AR equipment, so that the display form of the navigation is enriched; the navigation is carried out through the three-dimensional virtual arrow, the three-dimensional virtual arrow can indicate a plurality of directions, the method for indicating the three-dimensional virtual arrow is more definite, and the problem that a user cannot accurately judge the driving direction is solved.
It will be understood by those skilled in the art that, in the methods of the specific embodiments above, the order in which the steps are written does not imply a strict execution order or constitute any limitation on the implementation; the specific execution order of the steps should be determined by their functions and possible inherent logic.
Based on the same inventive concept, a navigation device corresponding to the navigation method is also provided in the embodiments of the present disclosure, and as the principle of solving the problem of the device in the embodiments of the present disclosure is similar to the navigation method in the embodiments of the present disclosure, the implementation of the device may refer to the implementation of the method, and repeated details are not repeated.
Referring to fig. 4, there is shown an architecture diagram of a navigation device according to an embodiment of the present disclosure. The navigation device includes: a first determining module 401, an obtaining module 402, a second determining module 403, and a displaying module 404; wherein:
the first determining module 401 is configured to acquire a live-action image captured by an AR device, and determine three-dimensional pose information of the AR device;
an obtaining module 402, configured to obtain a navigation path of the AR device based on the three-dimensional pose information of the AR device and the destination information of the AR device;
a second determining module 403, configured to determine a display posture of a three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the navigation path;
a display module 404, configured to display, on the AR device, an AR navigation map including the three-dimensional virtual arrow and the live-action image according to the display posture of the three-dimensional virtual arrow.
In a possible implementation, the first determining module 401, when acquiring the live-action image captured by the AR device, is configured to:
and after the AR equipment is determined to start the navigation application, calling an image acquisition device of the AR equipment to shoot the live-action image in real time.
In a possible implementation, the second determining module 403, when determining the presentation pose of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the navigation path, is configured to:
determining a target inflection point in the navigation path closest to the AR device;
and determining the display posture of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR equipment and the position information of the target inflection point.
In one possible implementation, the three-dimensional pose information of the AR device includes position information and orientation information of the AR device; the three-dimensional virtual arrow has six degrees of freedom;
the second determining module 403, when determining the display pose of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the position information of the target inflection point, is configured to:
determining a state variation corresponding to the three-dimensional virtual arrow in each degree of freedom based on the three-dimensional pose information of the AR device and the position information of the target inflection point;
and determining the display posture of the three-dimensional virtual arrow based on the state variation of the three-dimensional virtual arrow corresponding to each degree of freedom.
In one possible implementation manner, each inflection point in the navigation path corresponds to a turning region; the turning region of the current inflection point is the portion of the straight segment from the previous inflection point to the current inflection point that lies within a preset distance of the current inflection point;
when the AR device is located in the turning region of the target inflection point, the second determining module is configured to determine the display posture of the three-dimensional virtual arrow according to the following method:
determining a next inflection point corresponding to the target inflection point and position information of the next inflection point based on the position information of the target inflection point;
and determining the display posture of the three-dimensional virtual arrow based on the pose information of the AR equipment and the position information of the next inflection point.
In one possible implementation, the three-dimensional pose information of the AR device includes position information and orientation information of the AR device;
the second determining module 403 is configured to determine three-dimensional pose information of the AR device according to the following method:
and determining the three-dimensional pose information of the AR equipment based on the live-action image shot by the AR equipment and a pre-constructed three-dimensional scene map.
In a possible implementation, the second determining module 403 is configured to determine three-dimensional pose information of the AR device according to the following method:
acquiring data acquired by an inertial measurement unit in the AR equipment, and determining three-dimensional pose information of the AR equipment based on the data acquired by the inertial measurement unit;
the apparatus further comprises a calibration module 405, the calibration module 405 configured to:
and calibrating the three-dimensional pose information of the AR equipment based on the live-action image shot by the AR equipment every other preset time.
The description of the processing flow of each module in the device and the interaction flow between the modules may refer to the related description in the above method embodiments, and will not be described in detail here.
Based on the device, when the navigation is performed for the user, the AR navigation chart containing the three-dimensional virtual arrow and the live-action image is displayed for the user through the AR equipment, so that the display form of the navigation is enriched; the navigation is carried out through the three-dimensional virtual arrow, the three-dimensional virtual arrow can indicate a plurality of directions, the method for indicating the three-dimensional virtual arrow is more definite, and the problem that a user cannot accurately judge the driving direction is solved.
Based on the same technical concept, an embodiment of the disclosure also provides a computer device. Referring to fig. 5, a schematic structural diagram of a computer device 500 provided in the embodiment of the present disclosure includes a processor 501, a memory 502, and a bus 503. The memory 502 is used for storing execution instructions and includes an internal memory 5021 and an external memory 5022; the internal memory 5021 temporarily stores operation data for the processor 501 and data exchanged with the external memory 5022 such as a hard disk, and the processor 501 exchanges data with the external memory 5022 through the internal memory 5021. When the computer device 500 runs, the processor 501 and the memory 502 communicate through the bus 503, so that the processor 501 executes the following instructions:
acquiring a live-action image shot by AR equipment, and determining three-dimensional pose information of the AR equipment;
acquiring a navigation path of the AR equipment based on the three-dimensional pose information of the AR equipment and the destination information of the AR equipment;
determining a display posture of a three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the navigation path;
and displaying an AR navigation map containing the three-dimensional virtual arrow and the live-action image on the AR equipment according to the display posture of the three-dimensional virtual arrow.
In a possible implementation, the instructions executed by the processor 501 for obtaining the live-action image taken by the AR device include:
and after the AR equipment is determined to start the navigation application, calling an image acquisition device of the AR equipment to shoot the live-action image in real time.
In a possible implementation, the processor 501 executes instructions to determine a display posture of a three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the navigation path, including:
determining a target inflection point in the navigation path closest to the AR device;
and determining the display posture of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR equipment and the position information of the target inflection point.
In one possible implementation, processor 501 executes instructions in which the three-dimensional pose information of the AR device includes position information and orientation information of the AR device; the three-dimensional virtual arrow has six degrees of freedom;
the determining the display posture of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the position information of the target inflection point comprises:
determining a state variation corresponding to the three-dimensional virtual arrow in each degree of freedom based on the three-dimensional pose information of the AR device and the position information of the target inflection point;
and determining the display posture of the three-dimensional virtual arrow based on the state variation of the three-dimensional virtual arrow corresponding to each degree of freedom.
In a possible implementation manner, in the instructions executed by the processor 501, each inflection point in the navigation path corresponds to a turning region; the turning region of the current inflection point is the portion of the straight segment from the previous inflection point to the current inflection point that lies within a preset distance of the current inflection point;
when the AR device is located in the turning region of the target inflection point, the display posture of the three-dimensional virtual arrow is determined according to the following method:
determining a next inflection point corresponding to the target inflection point and position information of the next inflection point based on the position information of the target inflection point;
and determining the display posture of the three-dimensional virtual arrow based on the pose information of the AR equipment and the position information of the next inflection point.
In one possible implementation, processor 501 executes instructions in which the three-dimensional pose information of the AR device includes position information and orientation information of the AR device;
determining three-dimensional pose information of the AR device according to:
and determining the three-dimensional pose information of the AR equipment based on the live-action image shot by the AR equipment and a pre-constructed three-dimensional scene map.
In one possible implementation, processor 501 executes instructions that determine three-dimensional pose information for the AR device according to the following method:
acquiring data acquired by an inertial measurement unit in the AR equipment, and determining three-dimensional pose information of the AR equipment based on the data acquired by the inertial measurement unit;
the method further comprises the following steps:
and calibrating the three-dimensional pose information of the AR equipment based on the live-action image shot by the AR equipment every other preset time.
The embodiments of the present disclosure also provide a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the steps of the navigation method described in the above method embodiments. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The computer program product of the navigation method provided in the embodiments of the present disclosure includes a computer-readable storage medium storing a program code, where instructions included in the program code may be used to execute the steps of the navigation method described in the above method embodiments, which may be referred to specifically in the above method embodiments, and are not described herein again.
The embodiments of the present disclosure also provide a computer program, which when executed by a processor implements any one of the methods of the foregoing embodiments. The computer program product may be embodied in hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied in a computer storage medium, and in another alternative embodiment, the computer program product is embodied in a Software product, such as a Software Development Kit (SDK), or the like.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the above-mentioned embodiments are merely specific embodiments of the present disclosure, used to illustrate its technical solutions rather than to limit them, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the art could, within the technical scope of the present disclosure, modify the technical solutions described in the foregoing embodiments, readily conceive of changes to them, or make equivalent substitutions of some of their technical features; such modifications, changes, or substitutions do not depart from the spirit and scope of the embodiments of the present disclosure and shall all be covered by it. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (10)

1. A navigation method, comprising:
acquiring a live-action image captured by an AR device, and determining three-dimensional pose information of the AR device;
acquiring a navigation path for the AR device based on the three-dimensional pose information of the AR device and destination information of the AR device;
determining a display posture of a three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the navigation path;
and displaying, on the AR device, an AR navigation map containing the three-dimensional virtual arrow and the live-action image according to the display posture of the three-dimensional virtual arrow.
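For illustration only (the claims disclose no code), the four steps above can be read as a simple per-frame pipeline. A minimal Python sketch of that reading follows; estimate_pose, plan_path, and arrow_yaw are invented stub names, not an API of this application:

    import numpy as np

    def estimate_pose(image):
        # Stub: stands in for visual localization (claim 6) or the
        # IMU-based estimate (claim 7); returns position and heading.
        return np.zeros(3), 0.0

    def plan_path(position, destination):
        # Stub: stands in for a path-planning service; the path is an
        # ordered list of 3D inflection points ending at the destination.
        return [np.array([5.0, 0.0, 0.0]), np.asarray(destination, dtype=float)]

    def arrow_yaw(position, waypoint):
        # Aim the three-dimensional virtual arrow from the device
        # position toward the next waypoint on the path.
        d = waypoint - position
        return float(np.arctan2(d[1], d[0]))

    destination = [5.0, 10.0, 0.0]
    position, heading = estimate_pose(image=None)
    path = plan_path(position, destination)
    print("arrow yaw (rad):", arrow_yaw(position, path[0]))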
2. The method of claim 1, wherein acquiring the live-action image captured by the AR device comprises:
after determining that the AR device has started a navigation application, invoking an image acquisition apparatus of the AR device to capture the live-action image in real time.
3. The method according to claim 1 or 2, wherein determining the display posture of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the navigation path comprises:
determining a target inflection point in the navigation path that is closest to the AR device;
and determining the display posture of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and position information of the target inflection point.
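A minimal sketch of the target-inflection-point selection above, assuming a plain Euclidean nearest-point rule (the claim does not specify a distance metric); nearest_inflection is an invented name:

    import numpy as np

    def nearest_inflection(device_pos, inflection_points):
        # Index of the inflection point closest to the AR device.
        pts = np.asarray(inflection_points, dtype=float)
        dists = np.linalg.norm(pts - np.asarray(device_pos, dtype=float), axis=1)
        return int(np.argmin(dists))

    path = [(0.0, 0.0, 0.0), (5.0, 0.0, 0.0), (5.0, 8.0, 0.0)]
    print(nearest_inflection((4.2, 0.5, 0.0), path))  # -> 1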
4. The method of claim 3, wherein the three-dimensional pose information of the AR device comprises position information and orientation information of the AR device, and the three-dimensional virtual arrow has six degrees of freedom;
wherein determining the display posture of the three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the position information of the target inflection point comprises:
determining a state variation of the three-dimensional virtual arrow in each degree of freedom based on the three-dimensional pose information of the AR device and the position information of the target inflection point;
and determining the display posture of the three-dimensional virtual arrow based on the state variation of the three-dimensional virtual arrow in each degree of freedom.
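One possible reading of the per-degree-of-freedom state variation, sketched in Python: three translational offsets toward the target inflection point plus the yaw and pitch needed to point at it, with roll held at zero (a convention assumed here, not fixed by the claim):

    import numpy as np

    def arrow_state_change(device_pos, device_yaw, target_pos):
        # Offsets along the three translational degrees of freedom, and
        # the rotations needed to aim the arrow at the target point.
        delta = np.asarray(target_pos, dtype=float) - np.asarray(device_pos, dtype=float)
        yaw = float(np.arctan2(delta[1], delta[0]) - device_yaw)
        pitch = float(np.arctan2(delta[2], np.linalg.norm(delta[:2])))
        return {"dx": delta[0], "dy": delta[1], "dz": delta[2],
                "yaw": yaw, "pitch": pitch, "roll": 0.0}

    print(arrow_state_change((0.0, 0.0, 0.0), 0.0, (3.0, 4.0, 0.0)))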
5. The method according to claim 3, wherein each inflection point in the navigation path corresponds to a turning region, the turning region of a current inflection point being the portion of the straight line from the previous inflection point to the current inflection point that lies within a preset distance of the current inflection point;
and when the AR device is located in the turning region of the target inflection point, the display posture of the three-dimensional virtual arrow is determined as follows:
determining, based on the position information of the target inflection point, the next inflection point after the target inflection point and position information of the next inflection point;
and determining the display posture of the three-dimensional virtual arrow based on the pose information of the AR device and the position information of the next inflection point.
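A sketch of the turning-region test, under the reading above: the region is the stretch of the previous-to-current segment lying within the preset distance of the current inflection point. The lateral tolerance and all names are assumptions made only for this illustration:

    import numpy as np

    def in_turning_region(device_pos, prev_pt, curr_pt, preset_dist, lateral_tol=0.5):
        p = np.asarray(device_pos, dtype=float)
        a = np.asarray(prev_pt, dtype=float)
        b = np.asarray(curr_pt, dtype=float)
        ab = b - a
        # Project the device position onto the previous->current segment.
        t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        near_segment = np.linalg.norm(p - (a + t * ab)) <= lateral_tol
        # In the region iff near the segment and within the preset
        # distance of the current inflection point.
        return bool(near_segment and np.linalg.norm(p - b) <= preset_dist)

    # Once inside the region, the arrow is aimed at the next inflection point.
    print(in_turning_region((4.0, 0.1, 0.0), (0.0, 0.0, 0.0), (5.0, 0.0, 0.0), 2.0))  # True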
6. The method according to any one of claims 1 to 5, wherein the three-dimensional pose information of the AR device comprises position information and orientation information of the AR device;
and wherein the three-dimensional pose information of the AR device is determined as follows:
determining the three-dimensional pose information of the AR device based on the live-action image captured by the AR device and a pre-constructed three-dimensional scene map.
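The claim leaves the image-to-map localization method open. One common way to realize it, assumed here purely for illustration, is perspective-n-point (PnP) on 2D-3D correspondences between image keypoints and scene-map points, for example with OpenCV's solvePnP; all coordinate values below are invented:

    import numpy as np
    import cv2  # OpenCV; one possible realization, not mandated by the claim

    # Assume 2D keypoints in the live-action image have already been matched
    # to 3D points in the pre-constructed scene map (matching omitted here).
    object_points = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0],
                              [0, 0, 1], [1, 0, 1]], dtype=np.float64)
    image_points = np.array([[320, 240], [420, 238], [424, 330], [318, 334],
                             [322, 160], [418, 158]], dtype=np.float64)
    K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float64)

    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
    if ok:
        R, _ = cv2.Rodrigues(rvec)
        # Device (camera) position in map coordinates is -R^T t; R encodes
        # the orientation part of the pose.
        print("device position:", (-R.T @ tvec).ravel())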
7. The method of claim 1, wherein the three-dimensional pose information of the AR device is determined as follows:
acquiring data collected by an inertial measurement unit in the AR device, and determining the three-dimensional pose information of the AR device based on the collected data;
and wherein the method further comprises:
calibrating, at preset time intervals, the three-dimensional pose information of the AR device based on the live-action image captured by the AR device.
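A sketch of the two-part scheme above: integrate inertial measurement unit (IMU) data every frame and, at preset intervals, replace the drifting dead-reckoned estimate with a vision-based pose. The interval, the motion values, and the visual_localize stub are all invented for illustration:

    import numpy as np

    CALIBRATION_INTERVAL = 30  # frames between visual calibrations (assumed)
    DT = 1.0 / 30.0            # IMU sample period (assumed)

    def visual_localize(dead_reckoned):
        # Stub: stands in for re-localizing against the scene map from the
        # live-action image (the method of claim 6); here it merely damps
        # the accumulated drift.
        return dead_reckoned * 0.98

    position = np.zeros(3)
    velocity = np.zeros(3)
    for frame in range(1, 91):
        accel = np.array([0.1, 0.0, 0.0])  # stand-in for accelerometer data
        velocity += accel * DT             # dead-reckoning integration
        position += velocity * DT
        if frame % CALIBRATION_INTERVAL == 0:
            position = visual_localize(position)

    print("estimated position:", position)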
8. A navigation device, comprising:
a first determining module, configured to acquire a live-action image captured by an AR device and determine three-dimensional pose information of the AR device;
an acquiring module, configured to acquire a navigation path for the AR device based on the three-dimensional pose information of the AR device and destination information of the AR device;
a second determining module, configured to determine a display posture of a three-dimensional virtual arrow based on the three-dimensional pose information of the AR device and the navigation path;
and a display module, configured to display, on the AR device, an AR navigation map containing the three-dimensional virtual arrow and the live-action image according to the display posture of the three-dimensional virtual arrow.
9. A computer device, comprising: a processor, a memory, and a bus, wherein the memory stores machine-readable instructions executable by the processor; when the computer device runs, the processor and the memory communicate over the bus; and the machine-readable instructions, when executed by the processor, perform the steps of the navigation method of any one of claims 1 to 7.
10. A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and the computer program, when executed by a processor, performs the steps of the navigation method according to any one of claims 1 to 7.
CN202011549889.0A 2020-12-24 2020-12-24 Navigation method, navigation device, computer equipment and storage medium Active CN112729327B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011549889.0A CN112729327B (en) 2020-12-24 2020-12-24 Navigation method, navigation device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112729327A true CN112729327A (en) 2021-04-30
CN112729327B CN112729327B (en) 2024-06-07

Family

ID=75605923

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011549889.0A Active CN112729327B (en) 2020-12-24 2020-12-24 Navigation method, navigation device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112729327B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN204614232U (en) * 2015-05-12 2015-09-02 南京信息工程大学 Based on the three-dimensional path guidance system of mobile client and Quick Response Code
CN107256017A (en) * 2017-04-28 2017-10-17 中国农业大学 route planning method and system
US20190101407A1 (en) * 2017-09-29 2019-04-04 Beijing Kingsoft Internet Security Software Co., Ltd. Navigation method and device based on augmented reality, and electronic device
CN108917758A (en) * 2018-02-24 2018-11-30 石化盈科信息技术有限责任公司 A kind of navigation methods and systems based on AR
CN111627114A (en) * 2020-04-14 2020-09-04 北京迈格威科技有限公司 Indoor visual navigation method, device and system and electronic equipment

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022257698A1 (en) * 2021-06-11 2022-12-15 腾讯科技(深圳)有限公司 Electronic map-based interaction method and apparatus, computer device, and storage medium
CN113345013A (en) * 2021-06-29 2021-09-03 视伴科技(北京)有限公司 Method and device for generating way directing identification information
CN113566846A (en) * 2021-07-22 2021-10-29 北京百度网讯科技有限公司 Navigation calibration method and device, electronic equipment and computer readable medium
CN113566846B (en) * 2021-07-22 2022-11-04 北京百度网讯科技有限公司 Navigation calibration method and device, electronic equipment and computer readable medium
CN113776553A (en) * 2021-08-31 2021-12-10 深圳市慧鲤科技有限公司 AR data display method and device, electronic equipment and storage medium
WO2023131089A1 (en) * 2022-01-06 2023-07-13 华为技术有限公司 Augmented reality system, augmented reality scenario positioning method, and device
WO2023246530A1 (en) * 2022-06-20 2023-12-28 中兴通讯股份有限公司 Ar navigation method, and terminal and storage medium
TWI826189B (en) * 2022-12-16 2023-12-11 仁寶電腦工業股份有限公司 Controller tracking system and method with six degrees of freedom

Also Published As

Publication number Publication date
CN112729327B (en) 2024-06-07

Similar Documents

Publication Publication Date Title
CN112729327A (en) Navigation method, navigation device, computer equipment and storage medium
KR102414587B1 (en) Augmented reality data presentation method, apparatus, device and storage medium
CN110478901B (en) Interaction method and system based on augmented reality equipment
CN112146649B (en) Navigation method and device in AR scene, computer equipment and storage medium
EP2750110A1 (en) Information processing device, information processing method, and program
US11880956B2 (en) Image processing method and apparatus, and computer storage medium
CN108810473B (en) Method and system for realizing GPS mapping camera picture coordinate on mobile platform
CN112148197A (en) Augmented reality AR interaction method and device, electronic equipment and storage medium
US9144744B2 (en) Locating and orienting device in space
KR20150082358A (en) Reference coordinate system determination
CN112179331B (en) AR navigation method, AR navigation device, electronic equipment and storage medium
CN110276774B (en) Object drawing method, device, terminal and computer-readable storage medium
CN112771576A (en) Position information acquisition method, device and storage medium
KR20150114141A (en) System and method for motion estimation
CN111698646B (en) Positioning method and device
KR20210131414A (en) Interactive object driving method, apparatus, device and recording medium
CN111651051A (en) Virtual sand table display method and device
EP3275182B1 (en) Methods and systems for light field augmented reality/virtual reality on mobile devices
CN114202640A (en) Data acquisition method and device, computer equipment and storage medium
KR20190011492A (en) Device for providing content and method of operating the same
CN113390408A (en) Robot positioning method and device, robot and storage medium
CN112882576A (en) AR interaction method and device, electronic equipment and storage medium
CN111651052A (en) Virtual sand table display method and device, electronic equipment and storage medium
JP5518677B2 (en) Virtual information giving apparatus and virtual information giving program
CN111815783A (en) Virtual scene presenting method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant