CN112181141A - AR positioning method, AR positioning device, electronic equipment and storage medium - Google Patents

AR positioning method, AR positioning device, electronic equipment and storage medium

Info

Publication number
CN112181141A
CN112181141A (application CN202011012903.3A)
Authority
CN
China
Prior art keywords
positioning, equipment, pose, data, real
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011012903.3A
Other languages
Chinese (zh)
Other versions
CN112181141B (en)
Inventor
侯欣如
欧华富
周玉杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Sensetime Technology Development Co Ltd
Original Assignee
Beijing Sensetime Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Sensetime Technology Development Co Ltd filed Critical Beijing Sensetime Technology Development Co Ltd
Priority to CN202011012903.3A priority Critical patent/CN112181141B/en
Publication of CN112181141A publication Critical patent/CN112181141A/en
Application granted granted Critical
Publication of CN112181141B publication Critical patent/CN112181141B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/005: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05: Geographic models
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/006: Mixed reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/40: Scenes; Scene-specific elements in video content
    • G06V 20/46: Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames

Abstract

The present disclosure provides an AR positioning method, an AR positioning apparatus, an electronic device, and a storage medium. The method includes: acquiring a real scene image captured by an AR device; acquiring, based on the real scene image, an initial positioning pose of the AR device, and monitoring state data of the AR device; and, in a case where it is determined based on the monitored state data that the AR device satisfies a preset state condition, determining a real-time positioning pose of the AR device based on the initial positioning pose and positioning data output by a positioning component of the AR device.

Description

AR positioning method, AR positioning device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of augmented reality technologies, and in particular, to an AR positioning method, an AR positioning apparatus, an electronic device, and a storage medium.
Background
Generally, Augmented Reality (AR) technology fuses virtual information with the real world: computer-generated virtual information such as text, images, three-dimensional models, music and video is superimposed, after simulation, onto the real world, thereby augmenting it, that is, presenting virtual objects in the real world.
In an AR scene, AR content needs to be presented based on the positioning pose of the AR device, so the AR content presentation is closely related to the positioning pose of the AR device.
Disclosure of Invention
In view of the above, the present disclosure provides at least a method, an apparatus, an electronic device and a storage medium for AR positioning.
In a first aspect, the present disclosure provides an AR positioning method, including:
acquiring a real scene image captured by an AR device;
acquiring, based on the real scene image, an initial positioning pose of the AR device, and monitoring state data of the AR device;
and, in a case where it is determined based on the monitored state data that the AR device satisfies a preset state condition, determining a real-time positioning pose of the AR device based on the initial positioning pose and positioning data output by a positioning component of the AR device.
In this method, the initial positioning pose of the AR device is acquired from the captured real scene image, and the state data of the AR device is monitored. When the monitored state data indicates that the AR device satisfies the preset state condition, the real-time positioning pose can be determined from the initial positioning pose together with the positioning data output by the positioning component of the AR device. An initial positioning pose acquired from a real scene image is highly accurate, and the preset state condition ensures that the real-time pose derived from it remains accurate; determining the real-time pose from the initial pose plus the positioning component's output therefore improves the efficiency of determining the real-time positioning pose of the AR device.
In a possible embodiment, the method further comprises:
and acquiring the real-time positioning pose of the AR equipment based on the real scene image shot by the AR equipment under the condition that the AR equipment does not meet the preset state condition based on the monitored state data.
With this method, when the monitored state data indicates that the AR device does not satisfy the preset state condition, the real-time positioning pose cannot be reliably determined from the initial positioning pose and the positioning component's output; instead, it can still be accurately acquired based on a real scene image captured by the AR device.
In a possible implementation, determining that the AR device satisfies a preset state condition based on the monitored state data includes:
determining that the AR device satisfies the preset state condition when the monitored state data includes displacement data and rotation angle data of the AR device, and the variation of the displacement data and the variation of the rotation angle data within a preset time are both within a preset variation range; and/or,
determining that the AR device satisfies the preset state condition when the monitored state data includes displacement data on N preset directional axes and/or rotation angle data on M rotation angles, where N and M are positive integers greater than 1.
Here, the preset state conditions set as described above allow an accurate judgment of the state of the AR device. For example, whether the AR device satisfies a stability condition may be determined from the monitored state data: if the variation of the displacement data and the variation of the rotation angle data within the preset time are both within the preset variation range, the AR device satisfies the preset state condition, that is, the stability condition. As another example, whether state data has been lost may also be determined: if the monitored state data includes the displacement data of the N preset directional axes and/or the rotation angle data of the M preset rotation angles, the AR device satisfies the preset state condition, that is, no state data has been lost.
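As an illustrative sketch only, not the patented implementation, the data-completeness check described above (whether the monitored state data still contains displacement readings for all N preset directional axes and rotation readings for all M rotation angles) could look as follows; the dictionary field names and the choice N = M = 3 are assumptions:

```python
# Hypothetical shape of the monitored state data: a dict with per-axis
# displacement readings and per-angle rotation readings. The check treats the
# data as complete only if every preset axis and angle is present.

def state_data_complete(state_data: dict,
                        axes=("x", "y", "z"),
                        angles=("pitch", "yaw", "roll")) -> bool:
    displacement = state_data.get("displacement", {})
    rotation = state_data.get("rotation", {})
    has_all_axes = all(a in displacement for a in axes)
    has_all_angles = all(r in rotation for r in angles)
    return has_all_axes and has_all_angles
```

If the check fails, the state data is considered lost and, per the disclosure, the real-time pose would instead be re-acquired from a fresh real scene image.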
In a possible embodiment, the method further comprises:
displaying, by the AR device, an AR prompt element indicating a non-target activity area if it is determined that the AR device is located outside a target activity area based on the real-time positioning pose of the AR device.
In the above embodiment, when the AR device is determined to be located outside the target activity area based on the real-time positioning pose of the AR device, the AR prompt element may be displayed by the AR device, and the user may be visually and clearly prompted to be located in the non-target activity area by the displayed AR prompt element.
In one possible embodiment, the AR hint element indicating the non-target activity area includes: a mask overlay effect.
In a possible embodiment, the method further comprises:
and displaying failure reminding information through the AR equipment after the initial positioning pose is failed to be acquired.
In a possible implementation manner, after the initial positioning pose is failed to be acquired, displaying failure reminding information through the AR device includes:
if the current network connection of the AR equipment fails and the initial positioning pose acquisition fails, displaying first failure reminding information through the AR equipment;
and if the current network connection of the AR equipment is successful and the initial positioning pose acquisition fails, displaying second failure reminding information through the AR equipment.
In the above embodiment, after the initial positioning pose is failed to be acquired, different failure reminding information may be displayed according to different failure reasons, for example, if the initial positioning pose is failed to be acquired due to the current network connection failure, the first failure reminding information may be displayed; if the current network connection of the AR equipment is successful, but the initial positioning pose is failed to acquire, the second failure reminding information can be displayed, so that a user can perform different processing according to different failure reminding information, and the initial positioning pose of the AR equipment can be acquired after different processing.
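The two-branch failure reminder described above can be sketched as follows; the disclosure does not specify the wording of the first and second failure reminder information, so the message texts below are illustrative assumptions:

```python
# Selects which failure reminder (if any) to display, based on whether the
# network connection and the initial-pose acquisition succeeded.

def failure_reminder(network_connected: bool, pose_acquired: bool):
    """Return the reminder text to display, or None if no reminder is needed."""
    if pose_acquired:
        return None  # initial positioning pose acquired: nothing to display
    if not network_connected:
        # first failure reminder: acquisition failed because the network failed
        return "Positioning failed: please check your network connection."
    # second failure reminder: network is fine but visual relocalization failed
    return "Positioning failed: please adjust the shooting angle and try again."
```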
In a possible embodiment, the method further comprises: displaying positioning reminder information through the AR device during acquisition of the initial positioning pose of the AR device based on the real scene image; wherein the positioning reminder information comprises: reminder information indicating that positioning is in progress, and/or information indicating shooting constraints during positioning.
With this method, the positioning reminder information may be reminder information indicating that positioning is in progress; displaying it provides a visual indication of the AR positioning process. The positioning reminder information may also indicate the shooting constraints that apply during positioning; by following the displayed information, the user can operate accordingly, improving the efficiency of acquiring the initial positioning pose of the AR device.
In one possible embodiment, determining the real-time positioning pose of the AR device based on the initial positioning pose and the positioning data output by the positioning component of the AR device comprises:
converting the initial positioning pose to a virtual world coordinate system to generate a coordinate-converted initial positioning pose;
and determining pose information of the AR equipment in a virtual world coordinate system based on the initial positioning pose after the coordinate conversion and the positioning data output by the positioning component of the AR equipment, and determining the pose information as the real-time positioning pose of the AR equipment.
With this method, the initial positioning pose is converted into the virtual world coordinate system to produce a coordinate-converted initial positioning pose; the pose information of the AR device in the virtual world coordinate system is then determined from the coordinate-converted initial pose and the positioning data output by the positioning component of the AR device, and taken as the real-time positioning pose. Because the pose obtained throughout the positioning process is expressed in the virtual world coordinate system, the determined real-time positioning pose is more accurate.
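A minimal sketch of this coordinate conversion, under the common assumption that poses are represented as 4x4 homogeneous transform matrices (the disclosure does not fix a representation) and that the positioning component reports motion relative to the moment the initial pose was acquired:

```python
# `virtual_from_map` maps the real-scene map frame into the virtual world
# coordinate system; `map_from_device_init` is the initial positioning pose;
# `device_init_from_now` is the relative motion from the positioning component.
# All names are assumptions for illustration.

def mat4_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def real_time_pose(virtual_from_map, map_from_device_init, device_init_from_now):
    # step 1: convert the initial positioning pose into the virtual world frame
    virtual_from_device_init = mat4_mul(virtual_from_map, map_from_device_init)
    # step 2: chain on the relative motion reported by the positioning component
    return mat4_mul(virtual_from_device_init, device_init_from_now)
```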
For the effects of the apparatus, the electronic device, and the like described below, reference is made to the description of the method above; details are not repeated here.
In a second aspect, the present disclosure provides an AR positioning apparatus, comprising:
the acquisition module is used for acquiring a real scene image shot by the AR equipment;
the monitoring module is used for monitoring the state data of the AR equipment after the initial positioning pose of the AR equipment is successfully acquired based on the real scene image;
the first determining module is used for determining the real-time positioning pose of the AR equipment based on the initial positioning pose and the positioning data output by the positioning component in the AR equipment under the condition that the AR equipment meets the preset state condition based on the monitored state data.
In a possible embodiment, the apparatus further comprises:
and the second determining module is used for acquiring the real-time positioning pose of the AR equipment based on the real scene image shot by the AR equipment under the condition that the AR equipment is determined not to meet the preset state condition based on the monitored state data.
In a possible implementation manner, the first determining module, when determining that the AR device satisfies a preset state condition based on the monitored state data, is configured to:
determining that the AR device satisfies the preset state condition when the monitored state data includes displacement data and rotation angle data of the AR device, and the variation of the displacement data and the variation of the rotation angle data within a preset time are both within a preset variation range; and/or,
determining that the AR device satisfies the preset state condition when the monitored state data includes displacement data on N preset directional axes and/or rotation angle data on M rotation angles, where N and M are positive integers greater than 1.
In a possible embodiment, the apparatus further comprises:
a first presentation module to present, by the AR device, an AR hint element indicating a non-target activity area if the AR device is determined to be outside of a target activity area based on the real-time positioning pose of the AR device.
In one possible embodiment, the AR hint element indicating the non-target activity area includes: a mask overlay effect.
In a possible embodiment, the apparatus further comprises:
and the second display module is used for displaying failure reminding information through the AR equipment after the initial positioning pose is failed to be obtained.
In a possible implementation manner, when the failure prompt information is displayed by the AR device after the initial positioning pose is failed to be acquired, the second display module is configured to:
if the current network connection of the AR equipment fails and the initial positioning pose acquisition fails, displaying first failure reminding information through the AR equipment;
and if the current network connection of the AR equipment is successful and the initial positioning pose acquisition fails, displaying second failure reminding information through the AR equipment.
In a possible embodiment, the apparatus further comprises:
a third display module, configured to display positioning reminder information through the AR device during acquisition of the initial positioning pose of the AR device based on the real scene image; wherein the positioning reminder information comprises: reminder information indicating that positioning is in progress, and/or information indicating shooting constraints during positioning.
In one possible embodiment, the first determining module, when determining the real-time positioning pose of the AR device based on the initial positioning pose and the positioning data output by the positioning component of the AR device, is configured to:
converting the initial positioning pose to a virtual world coordinate system to generate a coordinate-converted initial positioning pose;
and determining pose information of the AR equipment in a virtual world coordinate system based on the initial positioning pose after the coordinate conversion and the positioning data output by the positioning component of the AR equipment, and determining the pose information as the real-time positioning pose of the AR equipment.
In a third aspect, the present disclosure provides an electronic device comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating over the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the steps of the method of AR positioning as described in the first aspect or any embodiment above.
In a fourth aspect, the present disclosure provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of AR positioning as described in the first aspect or any one of the embodiments above.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the embodiments are briefly described below. The drawings, which are incorporated in and form a part of the specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain its technical solutions. The following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those skilled in the art may derive additional related drawings from them without inventive effort.
Fig. 1 is a schematic flowchart illustrating an AR positioning method according to an embodiment of the present disclosure;
fig. 2a shows an interface schematic diagram of an AR device in a method for AR positioning according to an embodiment of the present disclosure;
fig. 2b is a schematic interface diagram of an AR device in another AR positioning method provided in the embodiment of the present disclosure;
fig. 3a shows an interface schematic diagram of an AR device in a method for AR positioning according to an embodiment of the present disclosure;
fig. 3b is a schematic interface diagram of an AR device in another AR positioning method provided in the embodiment of the present disclosure;
fig. 4a shows an interface schematic diagram of an AR device in a method for AR positioning provided by an embodiment of the present disclosure;
fig. 4b is a schematic interface diagram of an AR device in another AR positioning method provided in the embodiment of the present disclosure;
fig. 5a is a schematic flow chart illustrating manual positioning in a method for AR positioning according to an embodiment of the present disclosure;
fig. 5b is a schematic flowchart illustrating automatic positioning in a method for AR positioning according to an embodiment of the present disclosure;
fig. 5c is a schematic flowchart illustrating listening positioning in a method for AR positioning according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram illustrating an architecture of an AR positioning apparatus provided in an embodiment of the present disclosure;
fig. 7 shows a schematic structural diagram of an electronic device 700 provided in an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more clear, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only a part of the embodiments of the present disclosure, not all of the embodiments. The components of the embodiments of the present disclosure, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure, presented in the figures, is not intended to limit the scope of the claimed disclosure, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the disclosure without making creative efforts, shall fall within the protection scope of the disclosure.
Generally, Augmented Reality (AR) technology fuses virtual information with the real world: computer-generated virtual information such as text, images, three-dimensional models, music and video is superimposed, after simulation, onto the real world, thereby augmenting it, that is, presenting virtual objects in the real world.
In an AR scene, AR content needs to be presented based on the positioning pose of the AR device, so the AR content presentation is closely related to the positioning pose of the AR device. In order to improve the accuracy of positioning, the embodiments of the present disclosure provide an AR positioning method.
The drawbacks noted above were identified by the inventors through practice and careful study; therefore, the process of discovering these problems, and the solutions the present disclosure proposes for them, should be regarded as the inventors' contribution to the present disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
To aid understanding of the embodiments of the present disclosure, the AR positioning method disclosed herein is first described in detail. The method is generally executed by a computer device with certain computing capability, including a terminal device, a server, or another processing device; the terminal device may be user equipment (UE), a mobile device, a user terminal, a cellular phone, a cordless phone, a personal digital assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, or an augmented reality (AR) device. In some possible implementations, the AR positioning method may be implemented by a processor calling computer-readable instructions stored in a memory.
Referring to fig. 1, a schematic flow chart of a method for AR positioning according to an embodiment of the present disclosure is shown, where the method includes: S101-S103, wherein:
and S101, acquiring a real scene image shot by the AR equipment.
And S102, acquiring an initial positioning pose of the AR equipment and monitoring state data of the AR equipment based on the real scene image.
S103, under the condition that the AR equipment meets the preset state condition based on the monitored state data, determining the real-time positioning pose of the AR equipment based on the initial positioning pose and the positioning data output by the positioning component of the AR equipment.
In this method, the initial positioning pose of the AR device is acquired from the captured real scene image, and the state data of the AR device is monitored. When the monitored state data indicates that the AR device satisfies the preset state condition, the real-time positioning pose can be determined from the initial positioning pose together with the positioning data output by the positioning component of the AR device. An initial positioning pose acquired from a real scene image is highly accurate, and the preset state condition ensures that the real-time pose derived from it remains accurate; determining the real-time pose from the initial pose plus the positioning component's output therefore improves the efficiency of determining the real-time positioning pose of the AR device.
For S101 and for S102:
in the embodiments of the present disclosure, the AR device is an intelligent device capable of supporting an AR function; by way of example, AR devices include, but are not limited to, mobile phones, tablet computers, AR glasses, and other electronic devices capable of presenting augmented reality effects.
Here, the real scene image may be an image captured by the AR device in real time, and the real scene image may be a grayscale image or a color image; meanwhile, the real scene image may also be an image containing depth information or the like.
In specific implementation, a real scene image captured by the AR device may be acquired and used to obtain the initial positioning pose of the AR device; at the same time, the state data of the AR device may be monitored. For example, the state data may be translation data and rotation data, that is, six-degree-of-freedom (6DoF) data.
In specific implementation, based on the real scene image, the initial positioning pose of the AR device may be determined by extracting information of a plurality of feature points from the real scene image, matching the extracted feature point information against a pre-constructed three-dimensional scene map, and thereby determining the initial positioning pose corresponding to the AR device.
The three-dimensional scene map may be constructed as follows: acquire a video corresponding to the real scene and sample multiple frames of scene image samples from it, or directly acquire multiple frames of scene image samples corresponding to the real scene; extract a plurality of sample feature points from these image samples using a neural network algorithm; and then construct the three-dimensional scene map based on the extracted sample feature point information. The scene image samples may be captured from different angles within the real scene.
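The feature-matching step described above might be sketched as follows. This is an assumed descriptor-based nearest-neighbour matcher with a ratio test, not the disclosure's specific algorithm; the resulting 2D-3D correspondences would then feed a standard pose solver (e.g. PnP) to recover the initial positioning pose.

```python
# image_feats / map_feats: lists of (id, descriptor) pairs, where each
# descriptor is a numeric vector. The representation is an assumption for
# illustration; at least two map features are assumed so the ratio test has
# a second-best candidate to compare against.

def match_features(image_feats, map_feats, ratio=0.8):
    """Return (image_id, map_id) pairs for unambiguous descriptor matches."""
    def dist(d1, d2):
        return sum((a - b) ** 2 for a, b in zip(d1, d2)) ** 0.5

    matches = []
    for img_id, img_desc in image_feats:
        ranked = sorted(map_feats, key=lambda mf: dist(img_desc, mf[1]))
        best, second = ranked[0], ranked[1]
        # Lowe-style ratio test: keep only clearly unambiguous matches
        if dist(img_desc, best[1]) < ratio * dist(img_desc, second[1]):
            matches.append((img_id, best[0]))
    return matches
```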
For S103:
in an alternative embodiment, the method further comprises: acquiring the real-time positioning pose of the AR device based on a real scene image captured by the AR device, in a case where it is determined based on the monitored state data that the AR device does not satisfy the preset state condition.
Whether the AR equipment meets a preset state condition or not can be judged based on the monitored state data, and if yes, the real-time positioning pose of the AR equipment is determined based on the initial positioning pose and positioning data output by a positioning component of the AR equipment; and if not, acquiring the real-time positioning pose of the AR equipment based on the real scene image shot by the AR equipment.
With this method, when the monitored state data indicates that the AR device does not satisfy the preset state condition, the real-time positioning pose cannot be reliably determined from the initial positioning pose and the positioning component's output; instead, it can still be accurately acquired based on a real scene image captured by the AR device.
Wherein, based on the monitored status data, determining that the AR device satisfies a preset status condition includes:
the first condition is that the AR device is determined to meet a preset state condition when the monitored state data comprises displacement data and rotation angle data of the AR device, and the variation of the displacement data and the variation of the rotation angle data of the AR device in a preset time are determined to be in a preset variation range.
Condition two: when the monitored state data are determined to include displacement data on N preset directional axes and/or rotation angle data for M rotation angles, the AR device is determined to satisfy the preset state condition, where N and M are positive integers greater than 1.
If the AR device is judged, based on the monitored state data, to satisfy condition one, condition two, or both, the AR device is determined to satisfy the preset state condition. The monitored state data may include displacement data and/or rotation angle data of the AR device. The displacement data may include data in the x-axis, y-axis, and z-axis directions, respectively; the rotation angle data may include the pitch, roll, and yaw rotation angles.
For condition one, when the monitored state data include displacement data and rotation angle data of the AR device, it is judged whether the variation of the displacement data and the variation of the rotation angle data within the preset period are within the preset variation ranges. If both variations are smaller than the corresponding preset variation range, the AR device is determined to satisfy the preset state condition, that is, the AR device is in a stable state; if the variation of the displacement data and/or the variation of the rotation angle data is greater than or equal to the corresponding preset variation range, the AR device is determined not to satisfy the preset state condition, that is, the AR device is in an unstable state. The variation range corresponding to the displacement data and the variation range corresponding to the rotation angle data may each be set according to actual conditions.
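Condition one can be sketched as a simple threshold check; the concrete thresholds and the list layout of the per-period changes are illustrative assumptions, since the patent only requires the variations to stay within a preset range:

```python
def is_stable(displacement_deltas, rotation_deltas,
              max_disp_change=0.05, max_rot_change=1.0):
    """Condition one: the AR device counts as stable when every displacement
    change (e.g. meters) and every rotation-angle change (e.g. degrees)
    over the preset period stays below its preset variation range."""
    return (all(abs(d) < max_disp_change for d in displacement_deltas)
            and all(abs(r) < max_rot_change for r in rotation_deltas))
```

A stable device keeps tracking from the positioning component; an unstable one falls back to image-based relocalization.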
For condition two, it can be judged whether any of the monitored state data is lost; if data are lost, the AR device is determined not to satisfy the preset state condition, and if no data are lost, the AR device is determined to satisfy it. In a specific implementation, if the state data should include displacement data on N preset directional axes and rotation angle data for M rotation angles, the number of displacement data types i and the number of rotation angle data types j actually contained in the monitored state data are determined. If i is less than N and/or j is less than M, the monitored state data are determined to be lost, that is, the AR device does not satisfy the preset state condition; if i equals N and j equals M, the monitored state data are determined to be complete, that is, the AR device satisfies the preset state condition. N and M are positive integers greater than 1; i and j are positive integers.
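Condition two, the completeness check (i == N and j == M), might look like the following sketch; the dictionary layout of the monitored state data is an assumption for illustration:

```python
def state_data_complete(state_data, n=3, m=3):
    """Condition two: the monitored state data are considered lost unless they
    contain displacement data for all N directional axes (i == N) and
    rotation angle data for all M rotation angles (j == M)."""
    i = len(state_data.get("displacement", {}))  # axes actually reported
    j = len(state_data.get("rotation", {}))      # angles actually reported
    return i == n and j == m
```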
The preset state conditions set as described above allow the state of the AR device to be judged accurately. For example, whether the AR device satisfies the stability condition can be determined from the monitored state data, that is, by judging whether the variation of the displacement data and the variation of the rotation angle data within the preset period are within the preset variation ranges; if so, the AR device satisfies the preset state condition, that is, the stability condition is met. For another example, whether the state data are lost can also be determined from the monitored state data, that is, by judging whether they include the displacement data on the N preset directional axes and/or the rotation angle data for the M rotation angles; if they do, the AR device satisfies the preset state condition, that is, no state data are lost.
Determining the real-time positioning pose of the AR device based on the initial positioning pose and the positioning data output by the positioning component of the AR device, when it is determined based on the monitored state data that the AR device satisfies the preset state condition, includes:
and S1031, converting the initial positioning pose to a virtual world coordinate system, and generating the initial positioning pose after coordinate conversion.
S1032: determining pose information of the AR device in the virtual world coordinate system based on the coordinate-converted initial positioning pose and the positioning data output by the positioning component of the AR device, and taking that pose information as the real-time positioning pose of the AR device.
Here, after the initial positioning pose of the AR device is acquired based on the real scene image (this pose being pose information in the coordinate system corresponding to the three-dimensional scene map), the initial positioning pose may be converted into the virtual world coordinate system to generate the coordinate-converted initial positioning pose.
For example, a coordinate transformation matrix between a coordinate system corresponding to the three-dimensional scene map and a virtual world coordinate system may be determined, and the initial positioning pose may be adjusted by using the coordinate transformation matrix to generate the initial positioning pose after coordinate transformation.
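For position alone, that conversion can be sketched with a 4x4 homogeneous transformation matrix; the plain-list representation and the pure-translation example matrix are assumptions for illustration:

```python
def convert_position(position, map_to_world):
    """Apply a 4x4 homogeneous coordinate-transformation matrix (scene-map
    frame to virtual-world frame) to a 3D position."""
    x, y, z = position
    homogeneous = [x, y, z, 1.0]
    return [sum(map_to_world[r][c] * homogeneous[c] for c in range(4))
            for r in range(3)]

# Illustrative transform: a pure translation of the map origin by (2, 0, -1).
MAP_TO_WORLD = [[1.0, 0.0, 0.0,  2.0],
                [0.0, 1.0, 0.0,  0.0],
                [0.0, 0.0, 1.0, -1.0],
                [0.0, 0.0, 0.0,  1.0]]
```

A full pose conversion would additionally rotate the orientation part of the pose by the matrix's rotation block.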
After the initial positioning pose has been converted into the virtual world coordinate system, it can be tracked using the positioning data output by the positioning component of the AR device: pose information of the AR device in the virtual world coordinate system is determined, and that pose information is taken as the real-time positioning pose of the AR device. In practice, the positioning data output by the positioning component may be a moving distance in a certain direction, a rotation angle, and the like; the coordinate-converted initial positioning pose is updated with the moving distance and rotation angle indicated by the positioning data to determine the pose information of the AR device in the virtual world coordinate system, that is, the real-time positioning pose of the AR device. For example, if the position coordinate indicated by the converted initial positioning pose is {1, 1, 1} and the positioning data indicate that the AR device has moved 1 meter along the x-axis, the position coordinate indicated by the pose information of the AR device in the virtual world coordinate system is determined to be {2, 1, 1}.
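The tracking update in that example (position {1, 1, 1} plus a 1-meter move along x) reduces to adding the reported displacement to the converted position; this sketch covers position only and leaves rotation to an analogous update:

```python
def track_position(position, displacement):
    """Advance the coordinate-converted position by the displacement reported
    by the positioning component of the AR device."""
    return [p + d for p, d in zip(position, displacement)]
```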
Alternatively, after the initial positioning pose has been converted into the virtual world coordinate system, it can be tracked using a simultaneous localization and mapping (SLAM) technique: pose information of the AR device in the virtual world coordinate system is determined and taken as the real-time positioning pose of the AR device. In a specific implementation, the coordinate-converted initial positioning pose may be scaled and converted into the coordinate system used by SLAM to generate a target initial positioning pose; the AR device is then tracked with the SLAM technique, intermediate pose information of the AR device in the SLAM coordinate system is determined, and that intermediate pose information is converted into the virtual world coordinate system to obtain the real-time positioning pose of the AR device.
When it is determined based on the monitored state data that the AR device does not satisfy the preset state condition, the real-time positioning pose of the AR device is determined based on the real scene image captured by the AR device and the constructed three-dimensional scene map.
With this method, the initial positioning pose is converted into the virtual world coordinate system to generate a coordinate-converted initial positioning pose; the pose information of the AR device in the virtual world coordinate system is then determined based on the coordinate-converted initial positioning pose and the positioning data output by the positioning component of the AR device, and taken as the real-time positioning pose. Because every pose obtained during positioning is expressed in the virtual world coordinate system, the determined real-time positioning pose is more accurate.
In an alternative embodiment, the method further comprises: displaying, by the AR device, an AR prompt element indicating a non-target activity area if it is determined, based on the real-time positioning pose of the AR device, that the AR device is located outside a target activity area. The AR prompt element indicating the non-target activity area includes a masking effect.
After the real-time positioning pose of the AR device is obtained, whether the AR device is located in the target activity area is monitored. If it is, no AR prompt element is displayed and the monitoring continues; if it is not, that is, the AR device is determined to be outside the target activity area, an AR prompt element indicating the non-target activity area is displayed by the AR device. The AR prompt element indicating the non-target activity area includes a masking effect, that is, a mask added over the display frame in the non-target activity area; for example, a layer of a preset color may be added over the display frame. In an alternative embodiment, the AR prompt element indicating the non-target activity area may further include a pop-up prompt effect, that is, prompt information displayed over the picture in a pop-up window to notify the user that they have entered a non-target activity area.
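Whether the device has left the target activity area can be sketched as a containment test on the planar position taken from the real-time positioning pose; the axis-aligned rectangular area is an illustrative assumption, since the patent does not fix the area's shape:

```python
def should_show_mask(position_xy, area_min, area_max):
    """Return True (display the masking AR prompt element) when the device's
    real-time position lies outside the rectangular target activity area."""
    inside = all(lo <= p <= hi
                 for p, lo, hi in zip(position_xy, area_min, area_max))
    return not inside
```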
In the above embodiment, when the AR device is determined, based on its real-time positioning pose, to be located outside the target activity area, an AR prompt element can be displayed by the AR device; the displayed element gives the user a clear visual indication that they are in a non-target activity area.
Referring to fig. 2a, an interface schematic diagram of the AR device in an AR positioning method, the interface may include basic activity task buttons located at the upper left, such as buttons corresponding to "level 1", "level 2", "level 3", and "level 4". Triggering the button for a level determines the target activity area corresponding to that level and the non-target activity area outside it. The interface schematic diagram also shows an AR prompt element 21 indicating a non-target activity area; here the prompt element is a masking effect. Referring to fig. 2b, an interface schematic diagram of the AR device in another AR positioning method, the interface likewise shows an AR prompt element 21 indicating a non-target activity area; here the prompt element is a pop-up prompt effect, and "Entering a non-target activity area, please take note" in fig. 2b is that AR prompt element.
In an alternative embodiment, the method further comprises: displaying failure reminding information through the AR device after acquisition of the initial positioning pose fails.
In a specific implementation, positioning may fail during the positioning process, that is, acquisition of the initial positioning pose may fail; when this occurs, failure reminding information can be displayed through the AR device to notify the user that positioning has failed.
Wherein displaying the failure reminding information through the AR device after acquisition of the initial positioning pose fails includes:
Case one: if the current network connection of the AR device has failed and acquisition of the initial positioning pose fails, first failure reminding information is displayed through the AR device.
Case two: if the current network connection of the AR device is successful but acquisition of the initial positioning pose still fails, second failure reminding information is displayed through the AR device.
For case one, acquiring the initial positioning pose of the AR device requires the AR device to maintain a successful network connection; if the network is not connected or the connection fails, the initial positioning pose cannot be acquired, that is, positioning fails. Therefore, when the current network connection of the AR device has failed and acquisition of the initial positioning pose fails, the first failure reminding information can be displayed through the AR device to notify the user that there is a problem with the network connection. For example, the first failure reminding information may remind the user to check whether the network is enabled, or it may remind the user to contact a staff member.
Referring to fig. 3a, an interface schematic diagram of the AR device in the AR positioning method, the interface shows first failure reminding information 31, namely "Positioning failed, please contact a staff member".
For case two, even when the current network connection of the AR device is successful, acquisition of the initial positioning pose may still fail; for example, if the real scene image captured by the AR device contains little image feature information, the initial positioning pose of the AR device may not be determinable from that image. Therefore, if the current network connection of the AR device is successful but acquisition of the initial positioning pose fails, the AR device displays second failure reminding information; for example, the second failure reminding information may prompt the user to change position.
Referring to fig. 3b, an interface schematic diagram of the AR device in the AR positioning method, the interface shows second failure reminding information 32: "Please change position; try another position and angle, and avoid being too close to an object or facing a wall or the floor". The second failure reminding information also carries a retry function button; after the button is triggered, the positioning process (here a manual positioning process) is entered again, in which the initial positioning pose of the AR device is determined from a real scene image captured by the AR device.
In the above embodiment, after acquisition of the initial positioning pose fails, different failure reminding information can be displayed for different failure causes: if acquisition failed because the current network connection failed, the first failure reminding information is displayed; if the network connection is successful but acquisition of the initial positioning pose still failed, the second failure reminding information is displayed. The user can thus take different actions depending on the reminding information shown, after which the initial positioning pose of the AR device can be acquired.
In an alternative embodiment, the method further comprises: displaying positioning reminding information through the AR device during the process of acquiring the initial positioning pose of the AR device based on the real scene image; the positioning reminding information includes reminder information indicating that positioning is in progress, and/or information indicating shooting constraint conditions during positioning.
Here, the positioning reminding information can be displayed through the AR device during the process of acquiring the initial positioning pose of the AR device based on the real scene image.
Referring to fig. 4a, an interface schematic diagram of the AR device in the AR positioning method, the interface shows positioning reminding information 41; here the reminding information indicates that positioning is in progress, namely "Positioning, please wait". Alternatively, referring to fig. 4b, an interface schematic diagram of the AR device in another AR positioning method, the interface also shows positioning reminding information 41; here the reminding information indicates the shooting constraint conditions during positioning: "Please hold the device level, point the camera forward, and move slowly left and right".
With this method, the positioning reminding information may be reminder information indicating that positioning is in progress, which gives the AR positioning process a visual display effect. The positioning reminding information may also indicate the shooting constraint conditions during positioning; by displaying it, the user can operate accordingly, which improves the efficiency of acquiring the initial positioning pose of the AR device.
Referring to fig. 5a, a flow diagram of manual positioning in an AR positioning method, and fig. 5b, a flow diagram of automatic positioning in an AR positioning method, the process of acquiring the initial positioning pose of the AR device is described by way of example with reference to figs. 5a and 5b. A positioning process triggered by the user is a manual positioning process; while the AR device is moving, its pose can also be positioned automatically, which is the automatic positioning process.
In a specific implementation, after positioning is started manually or automatically, that is, during the process of determining the initial positioning pose of the AR device based on the real scene image captured by the AR device, the positioning button displays "positioning", the currently displayed AR content is maintained, and positioning reminding information can be displayed through the AR device; this positioning reminding information includes reminder information indicating that positioning is in progress, and/or information indicating shooting constraint conditions during positioning.
Meanwhile, whether the network of the AR device is connected is judged. If it is not, the number of connection requests is determined and it is judged whether this is the 4th request (the request-count threshold can be set as needed); if not, the request is retried and the flow returns to judging whether the network is connected; if it is the 4th request, a pop-up window on the AR device prompts the user to contact a staff member (that is, the AR device displays the first failure reminding information).
If the network is connected, an attempt is made to obtain a positioning result. If the initial positioning pose is obtained, positioning is determined to be successful, the AR content is reloaded (that is, the AR content corresponding to the initial positioning pose is loaded; for example, the AR content may be the real scene image with an AR display effect superimposed), and the flow of automatic monitoring and positioning is entered; for example, the obtained initial positioning pose is converted into the virtual world coordinate system to generate the coordinate-converted initial positioning pose. If acquisition of the initial positioning pose fails, that is, positioning is determined to have failed, the number of positioning attempts is determined and it is judged whether this is the 4th attempt (the attempt-count threshold can be set as needed). If not (the number of attempts is less than 4), obtaining a positioning result is attempted again; if so (the number of attempts equals 4), then in the manual positioning process, prompt information such as a suggestion to change position can be displayed on the AR device, that is, the AR device displays the second failure reminding information, while in the automatic positioning process no prompt need be given, that is, no prompt information is presented on the AR device.
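The retry logic in figs. 5a and 5b (up to 4 attempts, then a failure reminder) can be sketched as follows; the callable interface is an assumption, standing in for whatever localization routine the device actually invokes:

```python
def acquire_initial_pose(attempt_localization, max_attempts=4):
    """Try to obtain the initial positioning pose up to max_attempts times
    (4 in the flow above; the threshold is configurable). Returns the pose,
    or None so the caller can display failure reminding information."""
    for _ in range(max_attempts):
        pose = attempt_localization()
        if pose is not None:
            return pose
    return None
```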
After the initial positioning pose of the AR device is obtained, the flow of automatic monitoring and positioning may be entered; fig. 5c shows a schematic diagram of the monitoring and positioning flow in the AR positioning method. After entering this flow, the activity area and the 6DoF state can be monitored. Here 6DoF refers to the state data of the AR device in six degrees of freedom, so monitoring the 6DoF state means monitoring the state data of the AR device; the six-degree-of-freedom state data include displacement data on 3 directional axes and 3 rotation angle data.
Monitoring the 6DoF state may consist of judging whether any of the monitored state data is lost: if not, the flow returns to continue monitoring the 6DoF state; if so, automatic positioning is started and the automatic positioning flow is entered. That is, when the monitored state data are determined to include the displacement data on the N preset directional axes and/or the rotation angle data for the M rotation angles (N and M being positive integers greater than 1), the AR device is determined to satisfy the preset state condition (no data are lost) and the flow returns to continue monitoring the 6DoF state; when the monitored state data do not include the displacement data on the N preset directional axes or the rotation angle data for the M rotation angles, the AR device is determined not to satisfy the preset state condition (data are lost), automatic positioning is started, and the automatic positioning flow is entered.
Alternatively, the variation of the state data within the preset period can be monitored. If the variation of the displacement data and the variation of the rotation angle data of the AR device within the preset period are within the preset variation ranges, the AR device is determined to satisfy the preset state condition and the flow returns to continue monitoring the 6DoF state; if either variation falls outside the preset variation range, the AR device is determined not to satisfy the preset state condition, automatic positioning is started, and the automatic positioning flow is entered.
The flow of monitoring the activity area may be as follows: when it is determined based on the monitored state data that the AR device satisfies the preset state condition, the real-time positioning pose of the AR device is determined based on the initial positioning pose (for example, based on the coordinate-converted initial positioning pose); when it is determined that the AR device does not satisfy the preset state condition, the real-time positioning pose is acquired based on a real scene image captured by the AR device. Then, based on the real-time positioning pose, it can be judged whether the AR device has entered an area requiring a reminder (for example, an area outside the target activity area); if so, a picture masking effect and the corresponding prompt information (tips) are displayed, that is, an AR prompt element indicating the non-target activity area is displayed through the AR device.
It will be understood by those skilled in the art that in the method of the present invention, the order of writing the steps does not imply a strict order of execution and any limitations on the implementation, and the specific order of execution of the steps should be determined by their function and possible inherent logic.
Based on the same concept, an embodiment of the present disclosure further provides an apparatus for positioning an augmented reality AR, as shown in fig. 6, which is an architecture schematic diagram of the apparatus for positioning an augmented reality AR provided by the embodiment of the present disclosure, and the apparatus includes an obtaining module 601, a monitoring module 602, a first determining module 603, a second determining module 604, a first displaying module 605, a second displaying module 606, and a third displaying module 607, specifically:
an obtaining module 601, configured to obtain a real scene image captured by an AR device;
a monitoring module 602, configured to monitor state data of the AR device after an initial positioning pose of the AR device is successfully acquired based on the real scene image;
a first determining module 603, configured to determine a real-time positioning pose of the AR device based on the initial positioning pose and positioning data output by a positioning component in the AR device, when it is determined that the AR device satisfies a preset state condition based on the monitored state data.
In a possible embodiment, the apparatus further comprises:
a second determining module 604, configured to, when it is determined that the AR device does not meet a preset state condition based on the monitored state data, obtain a real-time positioning pose of the AR device based on a real scene image captured by the AR device.
In a possible implementation manner, the first determining module 603, when determining that the AR device satisfies the preset state condition based on the monitored state data, is configured to:
determining that the AR device satisfies the preset state condition when the monitored state data include displacement data and rotation angle data of the AR device and the variation of the displacement data and the variation of the rotation angle data within a preset period are determined to be within preset variation ranges; and/or,
determining that the AR device satisfies the preset state condition in a case where it is determined that the monitored state data includes displacement data at preset N directional axes, and/or rotation angle data at M rotation angles, where N and M are positive integers greater than 1.
In a possible embodiment, the apparatus further comprises:
a first presentation module 605, configured to present, by the AR device, an AR hint element indicating a non-target activity area if it is determined that the AR device is located outside the target activity area based on the real-time positioning pose of the AR device.
In one possible embodiment, the AR prompt element indicating the non-target activity area includes a masking effect.
In a possible embodiment, the apparatus further comprises:
and a second display module 606, configured to display failure prompt information through the AR device after the initial positioning pose fails to be acquired.
In a possible implementation manner, when the failure prompt information is displayed by the AR device after the initial positioning pose is failed to be acquired, the second display module 606 is configured to:
if the current network connection of the AR equipment fails and the initial positioning pose acquisition fails, displaying first failure reminding information through the AR equipment;
and if the current network connection of the AR equipment is successful and the initial positioning pose acquisition fails, displaying second failure reminding information through the AR equipment.
In a possible embodiment, the apparatus further comprises:
a third display module 607, configured to display positioning reminding information through the AR device in a process of acquiring an initial positioning pose of the AR device based on the real scene image; wherein the positioning reminding information comprises: the information includes reminder information indicating that positioning is being performed, and/or information indicating photographing constraint conditions at the time of positioning.
In one possible implementation, the first determining module 603, when determining the real-time positioning pose of the AR device based on the initial positioning pose and the positioning data output by the positioning component of the AR device, is configured to:
converting the initial positioning pose to a virtual world coordinate system to generate a coordinate-converted initial positioning pose;
and determining pose information of the AR equipment in a virtual world coordinate system based on the initial positioning pose after the coordinate conversion and the positioning data output by the positioning component of the AR equipment, and determining the pose information as the real-time positioning pose of the AR equipment.
In some embodiments, the functions of the apparatus provided in the embodiments of the present disclosure, or the modules it includes, may be used to execute the method described in the above method embodiments; for specific implementation, reference may be made to the description of those method embodiments, which is not repeated here for brevity.
Based on the same technical concept, the embodiment of the disclosure also provides an electronic device. Referring to fig. 7, a schematic structural diagram of an electronic device 700 provided in the embodiment of the present disclosure includes a processor 701, a memory 702, and a bus 703. The memory 702 is used for storing execution instructions and includes a memory 7021 and an external memory 7022; the memory 7021 is also referred to as an internal memory, and is used to temporarily store operation data in the processor 701 and data exchanged with an external memory 7022 such as a hard disk, the processor 701 exchanges data with the external memory 7022 through the memory 7021, and when the electronic device 700 is operated, the processor 701 and the memory 702 communicate with each other through the bus 703, so that the processor 701 executes the following instructions:
acquiring a real scene image shot by AR equipment;
based on the real scene image, acquiring an initial positioning pose of the AR equipment and monitoring state data of the AR equipment;
and under the condition that the AR equipment meets the preset state condition based on the monitored state data, determining the real-time positioning pose of the AR equipment based on the initial positioning pose and the positioning data output by the positioning component of the AR equipment.
In addition, embodiments of the present disclosure further provide a computer-readable storage medium having a computer program stored thereon, where the computer program, when executed by a processor, performs the steps of the augmented reality (AR) positioning method described in the above method embodiments.
A computer program product of the AR positioning method provided in the embodiments of the present disclosure includes a computer-readable storage medium storing program code, where the instructions included in the program code may be used to execute the steps of the AR positioning method described in the above method embodiments; for details, reference may be made to the above method embodiments, which are not repeated here.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division into units is only one kind of logical division, and other divisions are possible in actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections between devices or units through some communication interfaces, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present disclosure may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such an understanding, the technical solutions of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or some of the steps of the methods described in the embodiments of the present disclosure. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above are only specific embodiments of the present disclosure, but the protection scope of the present disclosure is not limited thereto. Any changes or substitutions that a person skilled in the art could readily conceive of within the technical scope disclosed herein shall fall within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (12)

1. A method of Augmented Reality (AR) positioning, comprising:
acquiring a real scene image captured by an AR device;
acquiring an initial positioning pose of the AR device based on the real scene image, and monitoring state data of the AR device;
and in a case where it is determined, based on the monitored state data, that the AR device satisfies a preset state condition, determining a real-time positioning pose of the AR device based on the initial positioning pose and positioning data output by a positioning component of the AR device.
2. The method of claim 1, further comprising:
and acquiring the real-time positioning pose of the AR device based on a real scene image captured by the AR device, in a case where it is determined, based on the monitored state data, that the AR device does not satisfy the preset state condition.
3. The method of claim 1 or 2, wherein determining, based on the monitored state data, that the AR device satisfies the preset state condition comprises:
determining that the AR device satisfies the preset state condition in a case where the monitored state data comprises displacement data and rotation angle data of the AR device, and the variation of the displacement data and the variation of the rotation angle data of the AR device within a preset time period are determined to be within a preset variation range; and/or,
determining that the AR device satisfies the preset state condition in a case where it is determined that the monitored state data comprises displacement data on N preset direction axes and/or rotation angle data for M rotation angles, where N and M are positive integers greater than 1.
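A minimal sketch of the second branch of this claim, checking that the monitored state data spans N direction axes and/or M rotation angles. The axis and angle names and the defaults N = M = 3 are assumptions; the claim only requires N, M > 1:

```python
def covers_required_dimensions(displacements: dict, rotations: dict,
                               n_axes: int = 3, m_angles: int = 3) -> bool:
    """True when the state data includes displacement on at least N preset
    direction axes and/or rotation data for at least M rotation angles."""
    return len(displacements) >= n_axes or len(rotations) >= m_angles
```

For example, full three-axis displacement readings satisfy the condition, while a single-axis displacement paired with a single rotation angle would not.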
4. The method according to any one of claims 1 to 3, further comprising:
displaying, by the AR device, an AR prompt element indicating a non-target activity area if it is determined that the AR device is located outside a target activity area based on the real-time positioning pose of the AR device.
5. The method of claim 4, wherein the AR prompt element indicating the non-target activity area comprises: a masking special effect.
6. The method according to any one of claims 1 to 5, further comprising:
and displaying failure prompt information through the AR device in a case where acquisition of the initial positioning pose fails.
7. The method of claim 6, wherein displaying the failure prompt information through the AR device in the case where acquisition of the initial positioning pose fails comprises:
displaying first failure prompt information through the AR device if the current network connection of the AR device fails and acquisition of the initial positioning pose fails;
and displaying second failure prompt information through the AR device if the current network connection of the AR device succeeds but acquisition of the initial positioning pose fails.
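The two branches of this claim amount to selecting a prompt by network state. The message strings below are purely hypothetical; the disclosure does not specify the wording of either prompt:

```python
def failure_prompt(network_connected: bool) -> str:
    """Choose between the first and second failure prompt of claim 7."""
    if not network_connected:
        # First failure prompt: localization failed because the network is down.
        return "Positioning failed: please check your network connection."
    # Second failure prompt: the network is fine, so the visual localization
    # itself failed and the user should rescan the scene.
    return "Positioning failed: please move and rescan your surroundings."
```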
8. The method of any one of claims 1 to 7, further comprising: displaying positioning prompt information through the AR device during acquisition of the initial positioning pose of the AR device based on the real scene image; wherein the positioning prompt information comprises: prompt information indicating that positioning is in progress, and/or information indicating shooting constraint conditions during positioning.
9. The method of any one of claims 1 to 8, wherein determining the real-time positioning pose of the AR device based on the initial positioning pose and the positioning data output by the positioning component of the AR device comprises:
converting the initial positioning pose into a virtual world coordinate system to generate a coordinate-converted initial positioning pose;
and determining pose information of the AR device in the virtual world coordinate system based on the coordinate-converted initial positioning pose and the positioning data output by the positioning component of the AR device, and taking the pose information as the real-time positioning pose of the AR device.
10. An apparatus for Augmented Reality (AR) positioning, comprising:
the acquisition module is used for acquiring a real scene image shot by the AR equipment;
the monitoring module is used for monitoring the state data of the AR equipment after the initial positioning pose of the AR equipment is successfully acquired based on the real scene image;
the first determining module is used for determining the real-time positioning pose of the AR equipment based on the initial positioning pose and the positioning data output by the positioning component in the AR equipment under the condition that the AR equipment meets the preset state condition based on the monitored state data.
11. An electronic device, comprising: a processor, a memory, and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating with each other through the bus when the electronic device runs, and the machine-readable instructions, when executed by the processor, performing the steps of the augmented reality (AR) positioning method of any one of claims 1 to 9.
12. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the augmented reality (AR) positioning method of any one of claims 1 to 9.
CN202011012903.3A 2020-09-23 2020-09-23 AR positioning method and device, electronic equipment and storage medium Active CN112181141B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011012903.3A CN112181141B (en) 2020-09-23 2020-09-23 AR positioning method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN112181141A true CN112181141A (en) 2021-01-05
CN112181141B CN112181141B (en) 2023-06-23

Family

ID=73956913

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011012903.3A Active CN112181141B (en) 2020-09-23 2020-09-23 AR positioning method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112181141B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109166150A (en) * 2018-10-16 2019-01-08 青岛海信电器股份有限公司 Obtain the method, apparatus storage medium of pose
CN110031880A (en) * 2019-04-16 2019-07-19 杭州易绘科技有限公司 High-precision augmented reality method and apparatus based on Geographic mapping
WO2019205865A1 (en) * 2018-04-27 2019-10-31 腾讯科技(深圳)有限公司 Method, device and apparatus for repositioning in camera orientation tracking process, and storage medium
CN110738737A (en) * 2019-10-15 2020-01-31 北京市商汤科技开发有限公司 AR scene image processing method and device, electronic equipment and storage medium
CN110858414A (en) * 2018-08-13 2020-03-03 北京嘀嘀无限科技发展有限公司 Image processing method and device, readable storage medium and augmented reality system
CN111651057A (en) * 2020-06-11 2020-09-11 浙江商汤科技开发有限公司 Data display method and device, electronic equipment and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YANG Mengjia: "Survey of Inertial Navigation-Visual SLAM Technology", Information Technology and Informatization *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112985419A (en) * 2021-05-12 2021-06-18 中航信移动科技有限公司 Indoor navigation method and device, computer equipment and storage medium
CN112985419B (en) * 2021-05-12 2021-10-01 中航信移动科技有限公司 Indoor navigation method and device, computer equipment and storage medium
CN113359983A (en) * 2021-06-03 2021-09-07 北京市商汤科技开发有限公司 Augmented reality data presentation method and device, electronic equipment and storage medium
CN115690194A (en) * 2022-10-17 2023-02-03 广州赤兔宸行科技有限公司 Vehicle-mounted XR equipment positioning method, device, equipment and storage medium
CN115690194B (en) * 2022-10-17 2023-09-19 广州赤兔宸行科技有限公司 Vehicle-mounted XR equipment positioning method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN112181141B (en) 2023-06-23

Similar Documents

Publication Publication Date Title
EP3951721A1 (en) Method and apparatus for determining occluded area of virtual object, and terminal device
CN112181141A (en) AR positioning method, AR positioning device, electronic equipment and storage medium
CN110276840B (en) Multi-virtual-role control method, device, equipment and storage medium
CN112148197A (en) Augmented reality AR interaction method and device, electronic equipment and storage medium
CN111311756B (en) Augmented reality AR display method and related device
JP7339386B2 (en) Eye-tracking method, eye-tracking device, terminal device, computer-readable storage medium and computer program
CN112506340B (en) Equipment control method, device, electronic equipment and storage medium
CN112179331B (en) AR navigation method, AR navigation device, electronic equipment and storage medium
WO2018233623A1 (en) Method and apparatus for displaying image
CN112348968B (en) Display method and device in augmented reality scene, electronic equipment and storage medium
JP7078234B2 (en) How to create a 3D object to be placed in augmented reality space
CN112288882A (en) Information display method and device, computer equipment and storage medium
CN111640169A (en) Historical event presenting method and device, electronic equipment and storage medium
CN111651106A (en) Unread message prompting method, unread message prompting device, unread message prompting equipment and readable storage medium
CN114529647A (en) Object rendering method, device and apparatus, electronic device and storage medium
CN113178017A (en) AR data display method and device, electronic equipment and storage medium
CN106919260B (en) Webpage operation method and device
CN113160270A (en) Visual map generation method, device, terminal and storage medium
CN108241746B (en) Method and device for realizing visual public welfare activities
CN114187509B (en) Object positioning method and device, electronic equipment and storage medium
CN112991555B (en) Data display method, device, equipment and storage medium
CN112860060B (en) Image recognition method, device and storage medium
CN109675312B (en) Game item list display method and device
CN111953849A (en) Method and device for displaying message board, electronic equipment and storage medium
CN112699884A (en) Positioning method, positioning device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant