CN112181141B - AR positioning method and device, electronic equipment and storage medium - Google Patents

AR positioning method and device, electronic equipment and storage medium

Info

Publication number
CN112181141B
CN112181141B (application CN202011012903.3A)
Authority
CN
China
Prior art keywords
equipment
positioning
pose
data
preset
Prior art date
Legal status
Active
Application number
CN202011012903.3A
Other languages
Chinese (zh)
Other versions
CN112181141A (en)
Inventor
侯欣如
欧华富
周玉杰
Current Assignee
Beijing Sensetime Technology Development Co Ltd
Original Assignee
Beijing Sensetime Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Sensetime Technology Development Co Ltd filed Critical Beijing Sensetime Technology Development Co Ltd
Priority to CN202011012903.3A
Publication of CN112181141A
Application granted
Publication of CN112181141B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05 Geographic models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/40 Scenes; Scene-specific elements in video content
    • G06V 20/46 Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Software Systems (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Human Computer Interaction (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present disclosure provides an AR positioning method and apparatus, an electronic device, and a storage medium. The method includes: acquiring a real scene image captured by an AR device; acquiring an initial positioning pose of the AR device based on the real scene image, and monitoring state data of the AR device; and, when it is determined from the monitored state data that the AR device satisfies a preset state condition, determining a real-time positioning pose of the AR device based on the initial positioning pose and positioning data output by a positioning component of the AR device.

Description

AR positioning method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the technical field of augmented reality, and in particular to an AR positioning method and apparatus, an electronic device, and a storage medium.
Background
In general, augmented reality (Augmented Reality, AR) technology fuses virtual information with the real world: computer-generated virtual information such as text, images, three-dimensional models, music, and video is simulated and then applied to the real world, thereby augmenting it, i.e., presenting virtual things in the real world.
In an AR scene, AR content is presented based on the positioning pose of the AR device, so the AR content presentation is closely related to the positioning pose of the AR device.
Disclosure of Invention
In view of this, the present disclosure provides at least an AR positioning method and apparatus, an electronic device, and a storage medium.
In a first aspect, the present disclosure provides an AR positioning method, comprising:
acquiring a real scene image captured by an AR device;
acquiring an initial positioning pose of the AR device based on the real scene image, and monitoring state data of the AR device;
and, when it is determined from the monitored state data that the AR device satisfies a preset state condition, determining a real-time positioning pose of the AR device based on the initial positioning pose and positioning data output by a positioning component of the AR device.
According to the above method, the initial positioning pose of the AR device is acquired from the captured real scene image, and the state data of the AR device are monitored. When the monitored state data indicate that the AR device satisfies the preset state condition, the real-time positioning pose of the AR device can be determined from the initial positioning pose together with the positioning data output by the positioning component of the AR device. Because the initial positioning pose obtained from the real scene image is highly accurate, determining the real-time positioning pose this way only when the preset state condition holds, i.e., only when the resulting pose will be accurate, improves the efficiency of determining the real-time positioning pose of the AR device.
In a possible embodiment, the method further comprises:
when it is determined from the monitored state data that the AR device does not satisfy the preset state condition, acquiring the real-time positioning pose of the AR device based on the real scene image captured by the AR device.
With this method, when the monitored state data indicate that the AR device does not satisfy the preset state condition, the real-time positioning pose cannot be reliably determined from the initial positioning pose and the positioning component's output; in that case, it can instead be acquired accurately from the real scene image captured by the AR device.
In a possible implementation, determining, based on the monitored state data, that the AR device satisfies the preset state condition includes:
when the monitored state data include displacement data and rotation angle data of the AR device, and the change in the displacement data and the change in the rotation angle data within a preset time are both within a preset range, determining that the AR device satisfies the preset state condition; and/or,
when the monitored state data include displacement data for N preset direction axes and/or rotation angle data for M rotation angles, determining that the AR device satisfies the preset state condition, where N and M are positive integers greater than 1.
With the preset state conditions set as above, the state of the AR device can be judged accurately. For example, whether the AR device satisfies the stability condition can be judged from the monitored state data by checking whether the changes in its displacement data and rotation angle data within the preset time fall within the preset range; if so, the AR device satisfies the preset state condition, i.e., it is stable. As another example, whether state data have been lost can be judged by checking whether the monitored state data include displacement data for the N preset direction axes and/or rotation angle data for the M rotation angles; if so, the AR device satisfies the preset state condition, i.e., no state data have been lost.
In a possible embodiment, the method further comprises:
displaying, through the AR device, an AR prompt element indicating a non-target active area when it is determined from the real-time positioning pose of the AR device that the device is located outside the target active area.
In the above embodiment, when the real-time positioning pose indicates that the AR device is outside the target active area, the AR device can display an AR prompt element, which intuitively and clearly prompts the user that they are in a non-target active area.
In a possible implementation, the AR prompt element indicating the non-target active area includes a masking effect.
In a possible embodiment, the method further comprises:
displaying failure reminder information through the AR device after acquisition of the initial positioning pose fails.
In a possible implementation, displaying the failure reminder information through the AR device after acquisition of the initial positioning pose fails includes:
if the AR device's current network connection has failed and acquisition of the initial positioning pose fails, displaying first failure reminder information through the AR device;
if the AR device's current network connection is successful but acquisition of the initial positioning pose fails, displaying second failure reminder information through the AR device.
In the above embodiment, after acquisition of the initial positioning pose fails, different failure reminders can be displayed depending on the cause of the failure. For example, if acquisition failed because the current network connection failed, the first failure reminder can be displayed; if the network connection succeeded but acquisition still failed, the second failure reminder can be displayed. The user can then take different actions according to the reminder shown, after which the initial positioning pose of the AR device can be acquired.
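As a sketch, the branch between the two failure reminders might look like the following; the function name and message strings are illustrative assumptions, not taken from the patent:

```python
def failure_reminder(network_connected: bool, pose_acquired: bool):
    """Choose which failure reminder the AR device should display after an
    attempt to acquire the initial positioning pose. Returns None when no
    reminder is needed. Illustrative sketch only."""
    if pose_acquired:
        return None  # initial positioning pose acquired; nothing to display
    if not network_connected:
        # first failure reminder: acquisition failed because the network failed
        return "first failure reminder: check the network connection and retry"
    # second failure reminder: network is fine but localization itself failed
    return "second failure reminder: re-aim the camera at the scene and retry"
```

The two reminders map one-to-one onto the two failure causes above, so the user knows whether to fix the network or to re-shoot the scene.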
In a possible embodiment, the method further comprises: displaying positioning reminder information through the AR device while the initial positioning pose of the AR device is being acquired based on the real scene image, wherein the positioning reminder information includes reminder information indicating that positioning is in progress and/or information indicating the shooting constraints during positioning.
With this method, the positioning reminder information may indicate that positioning is in progress, giving the AR positioning process an intuitive visual presentation. It may also indicate the shooting constraints during positioning, so that the user can operate accordingly, which improves the efficiency of acquiring the initial positioning pose of the AR device.
In a possible implementation, determining the real-time positioning pose of the AR device based on the initial positioning pose and the positioning data output by the positioning component of the AR device includes:
converting the initial positioning pose into a virtual world coordinate system to generate a coordinate-converted initial positioning pose;
determining pose information of the AR device in the virtual world coordinate system based on the coordinate-converted initial positioning pose and the positioning data output by the positioning component of the AR device, and taking this pose information as the real-time positioning pose of the AR device.
With this method, the initial positioning pose is first converted into the virtual world coordinate system; the pose of the AR device in that coordinate system is then determined from the coordinate-converted initial pose and the positioning component's output, and taken as the real-time positioning pose. Because the pose information obtained during positioning is expressed in the virtual world coordinate system, the determined real-time positioning pose is more accurate.
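Under the common convention that poses are represented as 4x4 homogeneous transforms, the two steps above (coordinate conversion, then fusion with the positioning component's output) can be sketched as follows; the alignment transform between the real-world map frame and the virtual world coordinate system, and the component's relative pose, are assumed inputs:

```python
import numpy as np

def to_homogeneous(rotation, translation):
    """Pack a 3x3 rotation matrix and a 3-vector translation into a 4x4 pose."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

def realtime_pose(world_to_virtual, initial_pose, component_delta):
    """Step 1: convert the initial positioning pose into the virtual world
    coordinate system. Step 2: apply the relative motion accumulated by the
    positioning component (e.g. visual-inertial odometry) since that pose
    was acquired. All three inputs are 4x4 homogeneous transforms."""
    converted_initial = world_to_virtual @ initial_pose
    return converted_initial @ component_delta
```

With an identity alignment and no accumulated motion, the real-time pose reduces to the initial pose, which is the expected degenerate case.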
For the effects of the apparatus, the electronic device, and the storage medium, refer to the description of the method above; they are not repeated here.
In a second aspect, the present disclosure provides an AR positioning apparatus, comprising:
an acquisition module configured to acquire a real scene image captured by an AR device;
a monitoring module configured to monitor state data of the AR device after the initial positioning pose of the AR device is successfully acquired based on the real scene image;
a first determining module configured to determine the real-time positioning pose of the AR device based on the initial positioning pose and positioning data output by a positioning component of the AR device, when it is determined from the monitored state data that the AR device satisfies a preset state condition.
In a possible embodiment, the apparatus further comprises:
a second determining module configured to acquire the real-time positioning pose of the AR device based on the real scene image captured by the AR device when it is determined from the monitored state data that the AR device does not satisfy the preset state condition.
In a possible implementation, when determining, based on the monitored state data, that the AR device satisfies the preset state condition, the first determining module is configured to:
determine that the AR device satisfies the preset state condition when the monitored state data include displacement data and rotation angle data of the AR device, and the change in the displacement data and the change in the rotation angle data within a preset time are both within a preset range; and/or,
determine that the AR device satisfies the preset state condition when the monitored state data include displacement data for N preset direction axes and/or rotation angle data for M rotation angles, where N and M are positive integers greater than 1.
In a possible embodiment, the apparatus further comprises:
a first display module configured to display, through the AR device, an AR prompt element indicating a non-target active area when it is determined from the real-time positioning pose of the AR device that the device is located outside the target active area.
In a possible implementation manner, the AR prompt element indicating the non-target active area includes: masking effect.
In a possible embodiment, the apparatus further comprises:
a second display module configured to display failure reminder information through the AR device after acquisition of the initial positioning pose fails.
In a possible implementation, when displaying the failure reminder information through the AR device after acquisition of the initial positioning pose fails, the second display module is configured to:
display first failure reminder information through the AR device if the AR device's current network connection has failed and acquisition of the initial positioning pose fails;
display second failure reminder information through the AR device if the AR device's current network connection is successful but acquisition of the initial positioning pose fails.
In a possible embodiment, the apparatus further comprises:
a third display module configured to display positioning reminder information through the AR device while the initial positioning pose of the AR device is being acquired based on the real scene image, wherein the positioning reminder information includes reminder information indicating that positioning is in progress and/or information indicating the shooting constraints during positioning.
In a possible implementation, when determining the real-time positioning pose of the AR device based on the initial positioning pose and the positioning data output by the positioning component of the AR device, the first determining module is configured to:
convert the initial positioning pose into a virtual world coordinate system to generate a coordinate-converted initial positioning pose;
determine pose information of the AR device in the virtual world coordinate system based on the coordinate-converted initial positioning pose and the positioning data output by the positioning component of the AR device, and take this pose information as the real-time positioning pose of the AR device.
In a third aspect, the present disclosure provides an electronic device, comprising a processor, a memory, and a bus. The memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate over the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the AR positioning method described in the first aspect or any of the embodiments above.
In a fourth aspect, the present disclosure provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the AR positioning method described in the first aspect or any of the embodiments above.
The foregoing objects, features and advantages of the disclosure will be more readily apparent from the following detailed description of the preferred embodiments taken in conjunction with the accompanying drawings.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required for the embodiments are briefly described below. These drawings, which are incorporated in and constitute a part of the specification, show embodiments consistent with the present disclosure and, together with the description, serve to illustrate its technical solutions. It should be understood that the following drawings show only certain embodiments of the present disclosure and are therefore not to be considered limiting of its scope; a person of ordinary skill in the art may derive other related drawings from them without inventive effort.
Fig. 1 shows a flowchart of an AR positioning method provided by an embodiment of the present disclosure;
Fig. 2a shows an interface schematic diagram of an AR device in an AR positioning method provided by an embodiment of the present disclosure;
Fig. 2b shows an interface schematic diagram of an AR device in another AR positioning method provided by an embodiment of the present disclosure;
Fig. 3a shows an interface schematic diagram of an AR device in an AR positioning method provided by an embodiment of the present disclosure;
Fig. 3b shows an interface schematic diagram of an AR device in another AR positioning method provided by an embodiment of the present disclosure;
Fig. 4a shows an interface schematic diagram of an AR device in an AR positioning method provided by an embodiment of the present disclosure;
Fig. 4b shows an interface schematic diagram of an AR device in another AR positioning method provided by an embodiment of the present disclosure;
Fig. 5a shows a schematic flowchart of manual positioning in an AR positioning method provided by an embodiment of the present disclosure;
Fig. 5b shows a schematic flowchart of automatic positioning in an AR positioning method provided by an embodiment of the present disclosure;
Fig. 5c shows a schematic flowchart of listening-based positioning in an AR positioning method provided by an embodiment of the present disclosure;
Fig. 6 shows an architectural schematic diagram of an AR positioning apparatus provided by an embodiment of the present disclosure;
Fig. 7 shows a schematic structural diagram of an electronic device 700 provided by an embodiment of the present disclosure.
Detailed Description
For the purposes of making the objects, technical solutions, and advantages of the embodiments of the present disclosure more apparent, the technical solutions of the embodiments are described clearly and completely below with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present disclosure. The components of the embodiments, as generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description is not intended to limit the scope of the disclosure as claimed, but merely represents selected embodiments. All other embodiments obtained by those skilled in the art based on the embodiments of this disclosure without inventive effort fall within the scope of this disclosure.
In general, augmented reality (Augmented Reality, AR) technology fuses virtual information with the real world: computer-generated virtual information such as text, images, three-dimensional models, music, and video is simulated and then applied to the real world, thereby augmenting it, i.e., presenting virtual things in the real world.
In an AR scene, AR content is presented based on the positioning pose of the AR device, so the AR content presentation is closely related to the positioning pose of the AR device. To improve positioning accuracy, the embodiments of the present disclosure provide an AR positioning method.
It should be noted that like reference numerals and letters denote like items in the following figures; once an item is defined in one figure, it need not be further defined or explained in subsequent figures.
To facilitate understanding of the embodiments of the present disclosure, the AR positioning method disclosed therein is first described in detail. The AR positioning method provided by the embodiments of the present disclosure is generally executed by a computer device with some computing capability, for example a terminal device, a server, or another processing device. The terminal device may be user equipment (UE), a mobile device, a user terminal, a cellular phone, a cordless phone, a personal digital assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, an augmented reality (AR) device, or the like. In some possible implementations, the AR positioning method may be implemented by a processor invoking computer-readable instructions stored in a memory.
Referring to Fig. 1, a flowchart of an AR positioning method according to an embodiment of the present disclosure is shown. The method includes steps S101-S103:
S101: acquire a real scene image captured by the AR device.
S102: acquire the initial positioning pose of the AR device based on the real scene image, and monitor the state data of the AR device.
S103: when it is determined from the monitored state data that the AR device satisfies the preset state condition, determine the real-time positioning pose of the AR device based on the initial positioning pose and the positioning data output by the positioning component of the AR device.
According to the above method, the initial positioning pose of the AR device is acquired from the captured real scene image, and the state data of the AR device are monitored. When the monitored state data indicate that the AR device satisfies the preset state condition, the real-time positioning pose of the AR device can be determined from the initial positioning pose together with the positioning data output by the positioning component of the AR device. Because the initial positioning pose obtained from the real scene image is highly accurate, determining the real-time positioning pose this way only when the preset state condition holds, i.e., only when the resulting pose will be accurate, improves the efficiency of determining the real-time positioning pose of the AR device.
Regarding S101 and S102:
In the embodiments of the present disclosure, the AR device is a smart device that supports AR functions, including, but not limited to, electronic devices capable of presenting augmented reality effects, such as mobile phones, tablet computers, and AR glasses.
Here, the real scene image may be an image captured by the AR device in real time; it may be a grayscale image or a color image, and it may also contain depth information.
In implementation, a real scene image from the AR device may be acquired and used to obtain the initial positioning pose of the AR device; at the same time, the state data of the AR device may be monitored. For example, the state data may be translation data and rotation data, i.e., six-degree-of-freedom (6DoF) data.
In a specific implementation, the initial positioning pose of the AR device may be determined from the real scene image as follows: extract the information of a plurality of feature points contained in the real scene image, match the extracted feature point information against the constructed three-dimensional scene map, and determine the initial positioning pose corresponding to the AR device.
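The matching step can be sketched minimally with nearest-neighbor descriptor matching; the descriptor format and the distance threshold are illustrative assumptions, and a real system would pass the resulting 2D-3D correspondences to a PnP solver (e.g. OpenCV's `solvePnP`) to recover the initial pose:

```python
import numpy as np

def match_to_scene_map(image_descriptors, map_descriptors, max_dist=0.7):
    """For each feature descriptor extracted from the real scene image, find
    the nearest descriptor in the three-dimensional scene map; reject matches
    whose distance exceeds max_dist. Returns (image_idx, map_idx) pairs."""
    matches = []
    for i, desc in enumerate(image_descriptors):
        dists = np.linalg.norm(map_descriptors - desc, axis=1)
        j = int(np.argmin(dists))
        if dists[j] <= max_dist:
            matches.append((i, j))
    return matches
```

The distance threshold discards ambiguous matches so that outliers do not corrupt the subsequent pose estimate.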
The three-dimensional scene map may be constructed as follows: acquire a video of the real scene and sample multi-frame scene image samples from it, or directly acquire multi-frame scene image samples of the real scene; extract a plurality of sample feature points from these samples using a neural network; and construct the three-dimensional scene map based on the extracted sample feature point information. The multi-frame scene image samples may be captured from different angles within the real scene.
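The sampling-and-aggregation skeleton of that pipeline can be sketched as follows; the frame stride and point format are assumptions, and the neural-network feature extraction and triangulation steps are elided:

```python
import numpy as np

def sample_frame_indices(num_frames, stride):
    """Sample scene image frames from the captured video at a fixed stride."""
    return list(range(0, num_frames, stride))

def build_scene_map(per_frame_points):
    """Merge per-frame sample feature points (each an (N, 3) array of 3D
    points, e.g. produced by a neural feature extractor plus triangulation)
    into a single map point cloud. Deduplication and descriptor storage,
    which a real mapping pipeline needs, are omitted."""
    return np.vstack([pts for pts in per_frame_points if len(pts) > 0])
```

Sampling at a stride keeps the map-building cost bounded while still covering the scene from different angles.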
For S103:
In an alternative embodiment, the method further comprises: when it is determined from the monitored state data that the AR device does not satisfy the preset state condition, acquiring the real-time positioning pose of the AR device based on the real scene image captured by the AR device.
That is, whether the AR device satisfies the preset state condition can be judged from the monitored state data. If it does, the real-time positioning pose of the AR device is determined from the initial positioning pose and the positioning data output by the positioning component; if it does not, the real-time positioning pose is acquired from the real scene image captured by the AR device.
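That decision can be sketched as a simple dispatch; the two callables stand in for the component-based update and the image-based relocalization subsystems, which are not specified here:

```python
def realtime_localization(state_ok, initial_pose, component_update, visual_relocalize):
    """If the monitored state data satisfy the preset state condition, update
    the pose cheaply from the initial pose plus the positioning component's
    output; otherwise fall back to relocalizing from a new real scene image."""
    if state_ok:
        return component_update(initial_pose)
    return visual_relocalize()
```

The point of the branch is cost: the component-based update is cheap and runs every frame, while image-based relocalization is expensive and is reserved for when the component's data can no longer be trusted.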
By adopting the above method, when it is determined based on the monitored state data that the AR device does not meet the preset state condition, the real-time positioning pose cannot be reliably determined from the initial positioning pose and the positioning data output by the positioning component of the AR device; therefore, the real-time positioning pose of the AR device is instead acquired accurately based on the real scene image shot by the AR device.
Wherein determining, based on the monitored state data, that the AR device meets a preset state condition comprises:
and under the condition that the monitored state data comprise displacement data and rotation angle data of the AR equipment, and the AR equipment is determined to meet the preset state condition under the condition that the change amount of the displacement data and the change amount of the rotation angle data of the AR equipment in the preset time are in the preset change range.
In the second condition, in the case that the monitored state data includes displacement data of N preset direction axes and/or rotation angle data of M rotation angles, it is determined that the AR device meets the preset state condition, where N and M are positive integers greater than 1.
Here, if it is judged based on the monitored state data that the AR device meets the first condition, it is determined that the AR device meets the preset state condition; or, if it is judged that the AR device meets the second condition, it is determined that the AR device meets the preset state condition; or, if it is judged that the AR device meets both the first condition and the second condition, it is determined that the AR device meets the preset state condition. The monitored state data may comprise displacement data and/or rotation angle data of the AR device. The displacement data may include data in the x-axis, y-axis and z-axis directions, respectively; the rotation angle data may include a pitch angle (pitch), a roll angle (roll) and a yaw angle (yaw).
For the first condition, when the monitored state data includes displacement data and rotation angle data of the AR device, it is judged whether the change amount of the displacement data within the preset time is within the preset change range, and whether the change amount of the rotation angle data within the preset time is within the preset change range. If both the change amount of the displacement data and the change amount of the rotation angle data are smaller than the preset change range, it is determined that the AR device meets the preset state condition, that is, the AR device is in a stable state; if the change amount of the displacement data and/or the change amount of the rotation angle data is greater than or equal to the preset change range, it is determined that the AR device does not meet the preset state condition, that is, the AR device is in an unstable state. The change range corresponding to the displacement data and the change range corresponding to the rotation angle data may each be set according to the actual situation.
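A minimal sketch of the first condition, under assumed data shapes (each monitored sample is a 6DoF reading `(x, y, z, pitch, roll, yaw)`; the thresholds and function name are illustrative, not from the disclosure):

```python
import numpy as np

def meets_stability_condition(samples, disp_range, rot_range):
    """First condition: the AR device counts as stable when the change in its
    displacement data and in its rotation angle data over the preset time
    window both stay within the preset change ranges."""
    arr = np.asarray(samples, dtype=float)
    disp_change = np.ptp(arr[:, :3], axis=0).max()  # largest per-axis swing in x/y/z
    rot_change = np.ptp(arr[:, 3:], axis=0).max()   # largest swing in pitch/roll/yaw
    return bool(disp_change < disp_range and rot_change < rot_range)
```

A device held still returns `True` (stable state); any axis exceeding its range flips the result to `False` (unstable state), triggering re-positioning from the real scene image.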
For the second condition, it may be judged whether the monitored state data is lost; if loss exists, it is determined that the AR device does not meet the preset state condition; if no loss exists, it is determined that the AR device meets the preset state condition. In implementation, if the state data is preset to include displacement data of N direction axes and rotation angle data of M rotation angles, the number of displacement-data types i and the number of rotation-angle types j included in the monitored state data are determined. If i is smaller than N and/or j is smaller than M, it is determined that the monitored state data is lost, that is, the AR device does not meet the preset state condition; if i is equal to N and j is equal to M, it is determined that the monitored state data is not lost, that is, the AR device meets the preset state condition. N and M are positive integers greater than 1; i and j are positive integers.
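The second condition reduces to counting the kinds of data actually present. A sketch under an assumed dictionary layout for the monitored state data (the keys and function name are hypothetical):

```python
N_AXES = 3    # preset number of direction axes (x, y, z)
M_ANGLES = 3  # preset number of rotation angles (pitch, roll, yaw)

def state_data_lost(state_data):
    """Second condition: the state data counts as lost when the number of
    displacement-axis readings i or rotation-angle readings j falls short
    of the preset N or M."""
    i = len(state_data.get("displacement", {}))  # kinds of displacement data present
    j = len(state_data.get("rotation", {}))      # kinds of rotation data present
    return i < N_AXES or j < M_ANGLES
```

A full 6DoF reading yields `False` (no loss, condition met); a missing yaw reading, for example, yields `True` and triggers automatic re-positioning.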
Here, with the preset state conditions set as above, an accurate determination can be made as to the state of the AR device. For example, whether the AR device meets the stability condition can be judged by monitoring the state data, that is, by determining whether the change amount of the displacement data and the change amount of the rotation angle data within the preset time are within the preset change range; if so, the AR device meets the preset state condition, that is, the stability condition. For another example, whether the state data is lost can be judged by determining whether the monitored state data includes the displacement data of the N preset direction axes and/or the rotation angle data of the M rotation angles; if so, the AR device meets the preset state condition, that is, the state data is not lost.
In the case that it is determined, based on the monitored state data, that the AR device meets the preset state condition, determining the real-time positioning pose of the AR device based on the initial positioning pose and the positioning data output by the positioning component of the AR device includes:
S1031, converting the initial positioning pose into a virtual world coordinate system, and generating a coordinate-converted initial positioning pose.
S1032, determining pose information of the AR device in the virtual world coordinate system based on the coordinate-converted initial positioning pose and the positioning data output by the positioning component of the AR device, and determining that pose information as the real-time positioning pose of the AR device.
Here, after the initial positioning pose of the AR device is acquired based on the real scene image, the initial positioning pose is pose information under a coordinate system corresponding to the three-dimensional scene map, and the initial positioning pose may be converted into the virtual world coordinate system, so as to generate the initial positioning pose after coordinate conversion.
For example, a coordinate conversion matrix between a coordinate system corresponding to the three-dimensional scene map and a virtual world coordinate system may be determined, and the initial positioning pose is adjusted by using the coordinate conversion matrix, so as to generate the initial positioning pose after coordinate conversion.
After the initial positioning pose is converted into the virtual world coordinate system, the positioning data output by the positioning component of the AR device can be used to track from the initial positioning pose, determine the pose information of the AR device in the virtual world coordinate system, and take that pose information as the real-time positioning pose of the AR device. The positioning data output by the positioning component may be, for example, a moving distance and a rotation angle in a certain direction; the coordinate-converted initial positioning pose is updated with this moving distance and rotation angle to determine the pose information of the AR device in the virtual world coordinate system, that is, the real-time positioning pose. For example, if the position coordinates indicated by the coordinate-converted initial positioning pose are {1, 1, 1} and the positioning data indicates that the AR device moves by 1 meter along the x-axis direction, the position coordinates indicated by the pose information of the AR device in the virtual world coordinate system are {2, 1, 1}.
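The conversion and the subsequent update can be sketched with homogeneous 4x4 transforms. This is an illustrative translation-only sketch; the disclosure does not specify the matrix form, and the function names are assumptions:

```python
import numpy as np

def to_virtual_world(pose_map, T_world_from_map):
    """Convert a pose expressed in the scene-map coordinate system into the
    virtual world coordinate system via a 4x4 coordinate conversion matrix."""
    return T_world_from_map @ pose_map

def apply_positioning_data(pose_world, delta_xyz):
    """Advance the coordinate-converted pose by the displacement reported by
    the positioning component (rotation updates omitted for brevity)."""
    step = np.eye(4)
    step[:3, 3] = delta_xyz  # movement expressed in the device's own frame
    return pose_world @ step
```

With an identity orientation, a pose at position {1, 1, 1} moved 1 meter along the x-axis lands at {2, 1, 1}, matching the worked example above.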
Alternatively, after the initial positioning pose is converted into the virtual world coordinate system, the initial positioning pose can be tracked using simultaneous localization and mapping (SLAM) technology to determine the pose information of the AR device in the virtual world coordinate system, and this pose information is taken as the real-time positioning pose of the AR device. In specific implementation, the coordinate-converted initial positioning pose can be transformed into the coordinate system corresponding to SLAM to generate a target initial positioning pose; the AR device is then tracked according to the SLAM technology to determine intermediate pose information of the AR device in the SLAM coordinate system, and the intermediate pose information is transformed back into the virtual world coordinate system to obtain the real-time positioning pose of the AR device.
In the case that it is determined, based on the monitored state data, that the AR device does not meet the preset state condition, the real-time positioning pose of the AR device is determined based on the real scene image shot by the AR device and the constructed three-dimensional scene map.
By adopting the above method, the initial positioning pose can be converted into the virtual world coordinate system to generate a coordinate-converted initial positioning pose; the pose information of the AR device in the virtual world coordinate system is then determined based on the coordinate-converted initial positioning pose and the positioning data output by the positioning component of the AR device, and taken as the real-time positioning pose. Since the pose information obtained in this positioning process is a pose in the virtual world coordinate system, the determined real-time positioning pose is more accurate.
In an alternative embodiment, the method further comprises: displaying, through the AR device, an AR prompt element indicating a non-target active area in the case that it is determined, based on the real-time positioning pose of the AR device, that the AR device is located outside the target active area. The AR prompt element indicating a non-target active area comprises a masking effect.
After the real-time positioning pose of the AR device is obtained, whether the AR device is within the target active area is monitored. If so, no AR prompt element is displayed and monitoring continues; if not, that is, the AR device is determined to be outside the target active area, an AR prompt element indicating a non-target active area can be displayed through the AR device. The AR prompt element indicating the non-target active area includes a masking effect added to the display picture in the non-target active area; for example, a layer of a preset color may be superimposed on the display picture. In an alternative embodiment, the AR prompt element indicating the non-target active area may further include a popup prompt effect, that is, prompt information displayed on the display picture in a popup window to remind the user of having entered a non-target active area.
In the above embodiment, when it is determined based on the real-time positioning pose that the AR device is located outside the target active area, the AR prompt element may be displayed by the AR device; the displayed AR prompt element intuitively and clearly prompts the user that he or she is located in a non-target active area.
Referring to fig. 2a, an interface diagram of an AR device may include basic activity task buttons located at the upper left, such as "checkpoint 1", "checkpoint 2", "checkpoint 3" and "checkpoint 4". By triggering the button corresponding to a checkpoint, the target active area corresponding to that checkpoint and the non-target active area outside it may be determined. The interface diagram also shows an AR prompt element 21 indicating a non-target active area, where the AR prompt element is a masking effect. Referring to fig. 2b, another interface diagram of the AR device likewise shows an AR prompt element 21 indicating a non-target active area, where the AR prompt element is a popup prompt effect; "entered the non-target active area, please take note" in fig. 2b is the AR prompt element indicating the non-target active area.
In an alternative embodiment, the method further comprises: displaying failure reminding information through the AR device after the acquisition of the initial positioning pose fails.
In implementation, positioning may fail during the positioning process, that is, the acquisition of the initial positioning pose may fail; when this occurs, failure reminding information may be displayed by the AR device to prompt the user that positioning has failed.
Displaying failure reminding information through the AR device after the acquisition of the initial positioning pose fails includes the following cases:
in the first case, if the current network connection of the AR device fails and the acquisition of the initial positioning pose fails, first failure reminding information is displayed through the AR device.
In the second case, if the current network connection of the AR device is successful but the acquisition of the initial positioning pose fails, second failure reminding information is displayed through the AR device.
For the first case, acquiring the initial positioning pose of the AR device requires the AR device to maintain a successful network connection; if the network of the AR device is not connected or the connection fails, the initial positioning pose cannot be acquired, that is, positioning fails. Therefore, when the current network connection of the AR device fails and the acquisition of the initial positioning pose fails, the AR device can display the first failure reminding information to prompt the user that there is a problem with the network connection. For example, the first failure reminding information may be reminding information prompting the user to check whether the network is enabled, or reminding information prompting the user to contact a staff member.
Referring to fig. 3a, an interface diagram of an AR device shows first failure reminding information 31, where "positioning failed, please contact a staff member" is the first failure reminding information.
For the second case, even when the current network connection of the AR device is successful, the initial positioning pose may still fail to be acquired. For example, if the real scene image shot by the AR device contains little image feature information, the initial positioning pose of the AR device may not be determinable from that image. Therefore, if the current network connection is successful but the initial positioning pose is not obtained, the AR device displays the second failure reminding information; for example, the second failure reminding information may be reminding information prompting the user to change position.
Referring to fig. 3b, an interface diagram of the AR device shows second failure reminding information 32, where "positioning failed, please try changing to another position and angle, and avoid being too close to objects or facing a wall or the ground" is the second failure reminding information. A retry function button is also provided with the second failure reminding information; after the retry button is triggered, the positioning process (the manual positioning process) is re-entered, in which the initial positioning pose of the AR device is determined from the real scene image shot by the AR device.
In the above embodiment, after the acquisition of the initial positioning pose fails, different failure reminding information may be displayed according to the different failure reasons: if the acquisition fails because the current network connection fails, the first failure reminding information is displayed; if the current network connection is successful but the acquisition of the initial positioning pose still fails, the second failure reminding information is displayed. In this way, the user can take different actions according to the different failure reminders, so that the initial positioning pose of the AR device can be obtained after the corresponding handling.
In an alternative embodiment, the method further comprises: displaying positioning reminding information through the AR device during the process of acquiring the initial positioning pose of the AR device based on the real scene image; the positioning reminding information includes reminding information indicating that positioning is in progress and/or information indicating the shooting constraint conditions during positioning.
Here, the positioning reminding information can be displayed through the AR device during the process of acquiring the initial positioning pose of the AR device based on the real scene image.
Referring to fig. 4a, an interface diagram of an AR device shows positioning reminding information 41, which may be reminding information indicating that positioning is in progress; "positioning, please wait" is the positioning reminding information. Alternatively, referring to fig. 4b, another interface diagram of the AR device shows positioning reminding information that may be information indicating the shooting constraint conditions during positioning; for example, "hold the device horizontally in use, keep the camera facing forward, and move it slowly left and right" is the positioning reminding information.
By adopting the above method, the positioning reminding information may be reminding information indicating that positioning is in progress, and displaying it provides an intuitive indication of the AR positioning process. The positioning reminding information may also be information indicating the shooting constraint conditions during positioning; by displaying it, the user can operate according to the displayed reminder, which improves the efficiency of acquiring the initial positioning pose of the AR device.
Referring to fig. 5a, a schematic flow chart of manual positioning in the AR positioning method, and fig. 5b, a schematic flow chart of automatic positioning in the AR positioning method, the process of acquiring the initial positioning pose of the AR device is described exemplarily with reference to figs. 5a and 5b. A positioning process triggered by the user is the manual positioning process; during movement of the AR device, its pose can also be positioned automatically, which is the automatic positioning process.
In implementation, after positioning is started manually or automatically, the process of determining the initial positioning pose of the AR device based on the real scene image acquired by the AR device is entered. At this time, the positioning button shows "positioning", the current AR content display is maintained, and positioning reminding information can be displayed through the AR device; the positioning reminding information includes reminding information indicating that positioning is in progress and/or information indicating the shooting constraint conditions during positioning.
Meanwhile, whether the network of the AR device is connected is judged. If the network is not connected, the number of network connection requests is determined, and it is judged whether this is the 4th request (the threshold for the number of requests can be set as required); if not (fewer than 4 requests), the request is retried and the flow returns to judging whether the network is connected; if it is the 4th request, a popup window on the AR device prompts the user to contact a staff member (that is, the first failure reminding information is displayed through the AR device).
If the network is connected, an attempt is made to acquire a positioning result. If the initial positioning pose is acquired, that is, positioning is determined to be successful, the AR content is reloaded (that is, the AR content corresponding to the initial positioning pose is loaded; for example, the AR content may be the real scene image with an AR display effect superimposed on it), the flow of automatic monitoring and positioning is entered, and the acquired initial positioning pose may be converted into the virtual world coordinate system to generate the coordinate-converted initial positioning pose. If the acquisition of the initial positioning pose fails, that is, positioning is determined to have failed, the number of positioning attempts is determined and it is judged whether this is the 4th attempt (the threshold for the number of attempts can be set as required); if not (fewer than 4 attempts), acquisition of the positioning result is attempted again; if so (the number of attempts equals 4), then in the manual positioning process, reminding information such as a prompt to change position can be displayed on the AR device, that is, the second failure reminding information is displayed; in the automatic positioning process, the reminding information may not be displayed on the AR device.
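The retry logic of figs. 5a and 5b can be sketched as two bounded loops. The attempt limit of 4, the callback parameters, and the returned reminder strings are illustrative assumptions, not values fixed by the disclosure:

```python
MAX_ATTEMPTS = 4  # threshold for both request counts; settable as required

def acquire_initial_pose(check_network, try_positioning, manual=True):
    """Sketch of the positioning flow: verify the network with up to 4
    connection requests, then attempt to acquire the initial positioning
    pose up to 4 times; returns (pose, reminder-to-display)."""
    for _ in range(MAX_ATTEMPTS):
        if check_network():
            break
    else:
        # 4th request still unconnected: first failure reminding information
        return None, "contact staff"
    for _ in range(MAX_ATTEMPTS):
        pose = try_positioning()
        if pose is not None:
            return pose, None  # positioning succeeded; AR content is reloaded
    # 4 failed attempts: the second reminder is shown only in the manual flow
    return None, "change position" if manual else None
```

The `manual` flag mirrors the difference between the two flows: after exhausting the attempts, the manual process displays the second failure reminding information while the automatic process stays silent.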
After the initial positioning pose of the AR device is obtained, the flow of automatic monitoring and positioning may be entered; see the schematic flow chart of monitoring and positioning in the AR positioning method shown in fig. 5c. After entering this flow, the active area may be monitored, and the 6DoF state may be monitored. Here 6DoF refers to the state data of the AR device in six degrees of freedom, so monitoring the 6DoF state is monitoring the state data of the AR device; the state data in six degrees of freedom includes displacement data on 3 direction axes and 3 pieces of rotation angle data.
Monitoring the 6DoF state may include judging whether the monitored state data is lost; if not, the flow returns to continue monitoring the 6DoF state; if so, automatic positioning is started and the automatic positioning flow is entered. That is, in the case that the monitored state data includes the displacement data of the preset N direction axes and/or the rotation angle data of the M rotation angles, it is determined that the AR device meets the preset state condition (no loss exists), where N and M are positive integers greater than 1, and the flow returns to continue monitoring the 6DoF state; in the case that the monitored state data does not include the displacement data of the preset N direction axes or the rotation angle data of the M rotation angles, it is determined that the AR device does not meet the preset state condition (loss exists), automatic positioning is started, and the automatic positioning flow is entered.
The change amount of the state data within the preset time may also be monitored. If the change amount of the displacement data and the change amount of the rotation angle data of the AR device within the preset time are within the preset change range, it is determined that the AR device meets the preset state condition, and the flow returns to continue monitoring the 6DoF state; if they are outside the preset change range, it is determined that the AR device does not meet the preset state condition, automatic positioning is started, and the automatic positioning flow is entered.
The process of monitoring the active area may be: in the case that it is determined based on the monitored state data that the AR device meets the preset state condition, determining the real-time positioning pose of the AR device based on the initial positioning pose (illustratively, this can also be based on the coordinate-converted initial positioning pose); in the case that it is determined that the AR device does not meet the preset state condition, acquiring the real-time positioning pose based on the real scene image shot by the AR device. Based on the real-time positioning pose, whether the AR device has entered an area requiring a reminder (for example, an area outside the target active area) can be judged; if so, the picture masking effect and the corresponding prompt information (tips) are displayed, that is, the AR prompt element indicating the non-target active area is displayed through the AR device.
It will be appreciated by those skilled in the art that in the above-described method of the specific embodiments, the written order of steps is not meant to imply a strict order of execution but rather should be construed according to the function and possibly inherent logic of the steps.
Based on the same concept, the embodiment of the present disclosure further provides an apparatus for augmented reality AR positioning, referring to fig. 6, which is a schematic architecture diagram of the apparatus for augmented reality AR positioning provided by the embodiment of the present disclosure, including an obtaining module 601, a listening module 602, a first determining module 603, a second determining module 604, a first displaying module 605, a second displaying module 606, and a third displaying module 607, specifically:
an acquisition module 601, configured to acquire a real scene image captured by an AR device;
a monitoring module 602, configured to monitor state data of the AR device after the initial positioning pose of the AR device is successfully acquired based on the real scene image;
the first determining module 603 is configured to determine, based on the monitored state data, a real-time positioning pose of the AR device based on the initial positioning pose and positioning data output by a positioning component in the AR device, where the determining that the AR device meets a preset state condition.
In a possible embodiment, the apparatus further comprises:
a second determining module 604, configured to obtain, based on the real scene image captured by the AR device, a real-time positioning pose of the AR device when it is determined that the AR device does not meet the preset state condition based on the monitored state data.
In a possible implementation manner, the first determining module 603 is configured to, when determining, based on the monitored state data, that the AR device meets a preset state condition:
when the monitored state data includes displacement data and rotation angle data of the AR device, and the change amount of the displacement data and the change amount of the rotation angle data of the AR device within the preset time are within the preset change range, determining that the AR device meets the preset state condition; and/or,
and under the condition that the monitored state data comprise displacement data of N preset direction axes and/or rotation angle data of M rotation angles, determining that the AR equipment meets the preset state conditions, wherein N and M are positive integers larger than 1.
In a possible embodiment, the apparatus further comprises:
a first display module 605 is configured to display, by the AR device, an AR hint element indicating a non-target active area if it is determined that the AR device is located outside the target active area based on the real-time localization pose of the AR device.
In a possible implementation manner, the AR prompt element indicating the non-target active area includes: masking effect.
In a possible embodiment, the apparatus further comprises:
and a second display module 606, configured to display failure reminding information through the AR device after the acquisition of the initial positioning pose fails.
In a possible implementation manner, the second display module 606, when displaying failure reminding information through the AR device after the acquisition of the initial positioning pose fails, is configured to:
if the current network connection of the AR equipment fails and the initial positioning pose acquisition fails, displaying first failure reminding information through the AR equipment;
and if the current network connection of the AR equipment is successful and the initial positioning pose acquisition fails, displaying second failure reminding information through the AR equipment.
In a possible embodiment, the apparatus further comprises:
a third display module 607, configured to display positioning reminding information through the AR device in a process of acquiring an initial positioning pose of the AR device based on the real scene image; wherein, the positioning reminding information comprises: reminding information indicating that positioning is being performed, and/or information indicating shooting constraint conditions at the time of positioning.
In a possible implementation manner, the first determining module 603 is configured to, when determining the real-time positioning pose of the AR device based on the initial positioning pose and the positioning data output by the positioning component of the AR device:
converting the initial positioning pose into a virtual world coordinate system, and generating the initial positioning pose after coordinate conversion;
and determining pose information of the AR equipment under a virtual world coordinate system based on the initial positioning pose after coordinate conversion and positioning data output by a positioning component of the AR equipment, and determining the pose information as real-time positioning pose of the AR equipment.
In some embodiments, the functions or modules included in the apparatus provided by the embodiments of the present disclosure may be used to perform the methods described in the foregoing method embodiments; for specific implementation, reference may be made to the descriptions of the foregoing method embodiments, which are not repeated here for brevity.
Based on the same technical concept, the embodiment of the disclosure also provides electronic equipment. Referring to fig. 7, a schematic structural diagram of an electronic device 700 according to an embodiment of the disclosure includes a processor 701, a memory 702, and a bus 703. The memory 702 is configured to store execution instructions, including a memory 7021 and an external memory 7022; the memory 7021 is also referred to as an internal memory, and is used for temporarily storing operation data in the processor 701 and data exchanged with the external memory 7022 such as a hard disk, and the processor 701 exchanges data with the external memory 7022 through the memory 7021, and when the electronic device 700 is operated, the processor 701 and the memory 702 communicate through the bus 703, so that the processor 701 executes the following instructions:
acquiring a real scene image captured by an AR device;
acquiring an initial positioning pose of the AR device based on the real scene image, and monitoring state data of the AR device;
and, when it is determined based on the monitored state data that the AR device satisfies a preset state condition, determining the real-time positioning pose of the AR device based on the initial positioning pose and positioning data output by a positioning component of the AR device.
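The instruction sequence above can be sketched as a simple decision, assuming (hypothetically) that the state data arrives as periodic samples of displacement and rotation-angle readings; the preset state condition described in the claims (variation within a preset range over a preset time, and no loss of state data) then reduces to a range check over the monitoring window. All names and thresholds below are illustrative.

```python
def meets_state_condition(samples, max_disp_change=0.5, max_rot_change=15.0):
    """Check the preset state condition: no state-data sample was lost,
    and the variation of displacement (m) and rotation angle (deg)
    within the monitoring window stays inside a preset range."""
    if not samples or any(s is None for s in samples):  # state data lost
        return False
    disp = [s["displacement"] for s in samples]
    rot = [s["rotation_deg"] for s in samples]
    return (max(disp) - min(disp) <= max_disp_change
            and max(rot) - min(rot) <= max_rot_change)

def pose_source(samples):
    # When the condition holds, keep integrating the positioning
    # component; otherwise fall back to re-localizing from a new image.
    if meets_state_condition(samples):
        return "positioning_component"
    return "relocalize_from_image"
```

This mirrors the branch in the method: steady, uninterrupted state data lets the device track its pose from the positioning component, while a lost sample or an out-of-range jump triggers re-localization from a fresh real scene image.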
Furthermore, an embodiment of the present disclosure also provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program performs the steps of the augmented reality AR positioning method described in the above method embodiments.
The computer program product of the augmented reality AR positioning method provided in the embodiments of the present disclosure includes a computer-readable storage medium storing program code; the program code includes instructions for executing the steps of the augmented reality AR positioning method described in the above method embodiments. Reference may be made to those method embodiments for details, which are not repeated here.
It will be clear to those skilled in the art that, for convenience and brevity of description, the specific working procedures of the system and apparatus described above may refer to the corresponding procedures in the foregoing method embodiments, and are not described here again. It should be understood that, in the several embodiments provided in the present disclosure, the disclosed systems, devices, and methods may be implemented in other manners. The apparatus embodiments described above are merely illustrative. For example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some communication interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such an understanding, the technical solution of the present disclosure, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present disclosure. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
The foregoing is merely a specific embodiment of the disclosure, but the protection scope of the disclosure is not limited thereto. Any change or substitution that a person skilled in the art can readily conceive of within the technical scope of the disclosure shall be covered by the protection scope of the disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (12)

1. A method of augmented reality AR localization, comprising:
acquiring a real scene image shot by AR equipment;
based on the real scene image, acquiring an initial positioning pose of the AR equipment and monitoring state data of the AR equipment;
determining a real-time positioning pose of the AR equipment based on the initial positioning pose and positioning data output by a positioning component of the AR equipment under the condition that it is determined, based on the monitored state data, that the AR equipment satisfies a preset state condition; the preset state condition comprises: the variation of the displacement data and the variation of the rotation angle data of the AR equipment within a preset time are within a preset variation range; and/or, the state data is not lost.
2. The method according to claim 1, wherein the method further comprises:
And under the condition that the AR equipment does not meet the preset state condition based on the monitored state data, acquiring the real-time positioning pose of the AR equipment based on the real scene image shot by the AR equipment.
3. The method according to claim 1 or 2, wherein determining that the AR device satisfies a preset status condition based on the monitored status data comprises:
when the monitored state data comprises displacement data and rotation angle data of the AR equipment, and the variation of the displacement data and the variation of the rotation angle data of the AR equipment within the preset time are within a preset variation range, determining that the AR equipment satisfies the preset state condition; and/or,
under the condition that the monitored state data comprises displacement data of N preset direction axes and/or rotation angle data of M rotation angles, determining that the AR equipment satisfies the preset state condition, wherein N and M are positive integers greater than 1.
4. The method according to claim 1 or 2, characterized in that the method further comprises:
and displaying an AR prompt element indicating a non-target active area through the AR equipment under the condition that the AR equipment is determined to be positioned outside the target active area based on the real-time positioning pose of the AR equipment.
5. The method of claim 4, wherein the AR hint element indicating a non-target active area comprises: a masking effect.
6. The method according to claim 1 or 2, characterized in that the method further comprises:
and after the initial positioning pose is failed to be acquired, displaying failure reminding information through the AR equipment.
7. The method of claim 6, wherein displaying, by the AR device, failure alert information after failure to acquire the initial positioning pose, comprises:
if the current network connection of the AR equipment fails and the initial positioning pose acquisition fails, displaying first failure reminding information through the AR equipment;
and if the current network connection of the AR equipment is successful and the initial positioning pose acquisition fails, displaying second failure reminding information through the AR equipment.
8. The method according to claim 1 or 2, characterized in that the method further comprises: displaying positioning reminding information through the AR equipment in the process of acquiring the initial positioning pose of the AR equipment based on the real scene image; wherein, the positioning reminding information comprises: reminding information indicating that positioning is being performed, and/or information indicating shooting constraint conditions at the time of positioning.
9. The method of claim 1 or 2, wherein determining the real-time localization pose of the AR device based on the initial localization pose and localization data output by a localization component of the AR device comprises:
converting the initial positioning pose into a virtual world coordinate system, and generating the initial positioning pose after coordinate conversion;
and determining pose information of the AR equipment under a virtual world coordinate system based on the initial positioning pose after coordinate conversion and positioning data output by a positioning component of the AR equipment, and determining the pose information as real-time positioning pose of the AR equipment.
10. An apparatus for augmented reality AR positioning, comprising:
the acquisition module is used for acquiring the real scene image shot by the AR equipment;
the monitoring module is used for monitoring the state data of the AR equipment after the initial positioning pose of the AR equipment is successfully acquired based on the real scene image;
the first determining module is used for determining the real-time positioning pose of the AR equipment based on the initial positioning pose and the positioning data output by the positioning component in the AR equipment under the condition that it is determined, based on the monitored state data, that the AR equipment satisfies a preset state condition; the preset state condition comprises: the variation of the displacement data and the variation of the rotation angle data of the AR equipment within a preset time are within a preset variation range; and/or, the state data is not lost.
11. An electronic device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory in communication over the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the steps of the method of augmented reality AR localization according to any one of claims 1 to 9.
12. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a computer program which, when executed by a processor, performs the steps of the method of augmented reality AR localization according to any one of claims 1 to 9.
CN202011012903.3A 2020-09-23 2020-09-23 AR positioning method and device, electronic equipment and storage medium Active CN112181141B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011012903.3A CN112181141B (en) 2020-09-23 2020-09-23 AR positioning method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011012903.3A CN112181141B (en) 2020-09-23 2020-09-23 AR positioning method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112181141A CN112181141A (en) 2021-01-05
CN112181141B true CN112181141B (en) 2023-06-23

Family

ID=73956913

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011012903.3A Active CN112181141B (en) 2020-09-23 2020-09-23 AR positioning method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112181141B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112985419B (en) * 2021-05-12 2021-10-01 中航信移动科技有限公司 Indoor navigation method and device, computer equipment and storage medium
CN113359983A (en) * 2021-06-03 2021-09-07 北京市商汤科技开发有限公司 Augmented reality data presentation method and device, electronic equipment and storage medium
CN115690194B (en) * 2022-10-17 2023-09-19 广州赤兔宸行科技有限公司 Vehicle-mounted XR equipment positioning method, device, equipment and storage medium

Citations (6)

Publication number Priority date Publication date Assignee Title
CN109166150A (en) * 2018-10-16 2019-01-08 青岛海信电器股份有限公司 Obtain the method, apparatus storage medium of pose
CN110031880A (en) * 2019-04-16 2019-07-19 杭州易绘科技有限公司 High-precision augmented reality method and apparatus based on Geographic mapping
WO2019205865A1 (en) * 2018-04-27 2019-10-31 腾讯科技(深圳)有限公司 Method, device and apparatus for repositioning in camera orientation tracking process, and storage medium
CN110738737A (en) * 2019-10-15 2020-01-31 北京市商汤科技开发有限公司 AR scene image processing method and device, electronic equipment and storage medium
CN110858414A (en) * 2018-08-13 2020-03-03 北京嘀嘀无限科技发展有限公司 Image processing method and device, readable storage medium and augmented reality system
CN111651057A (en) * 2020-06-11 2020-09-11 浙江商汤科技开发有限公司 Data display method and device, electronic equipment and storage medium

Non-Patent Citations (1)

Title
Survey of Inertial-Visual SLAM Technology; Yang Mengjia; Information Technology and Informatization (信息技术与信息化); 2019-07-31 (No. 07); full text *


Similar Documents

Publication Publication Date Title
CN112181141B (en) AR positioning method and device, electronic equipment and storage medium
CN110352446B (en) Method and apparatus for obtaining image and recording medium thereof
EP3951721A1 (en) Method and apparatus for determining occluded area of virtual object, and terminal device
CN109947886B (en) Image processing method, image processing device, electronic equipment and storage medium
CN112148197A (en) Augmented reality AR interaction method and device, electronic equipment and storage medium
CN108805917A (en) Sterically defined method, medium, device and computing device
CN112348968B (en) Display method and device in augmented reality scene, electronic equipment and storage medium
KR20150059466A (en) Method and apparatus for recognizing object of image in electronic device
KR20180013277A (en) Electronic apparatus for displaying graphic object and computer readable recording medium
CN112148188A (en) Interaction method and device in augmented reality scene, electronic equipment and storage medium
CN111311756A (en) Augmented reality AR display method and related device
CN112882576B (en) AR interaction method and device, electronic equipment and storage medium
CN112291473B (en) Focusing method and device and electronic equipment
KR102337209B1 (en) Method for notifying environmental context information, electronic apparatus and storage medium
CN112733641A (en) Object size measuring method, device, equipment and storage medium
CN110544315B (en) Virtual object control method and related equipment
CN113178017A (en) AR data display method and device, electronic equipment and storage medium
CN108052506B (en) Natural language processing method, device, storage medium and electronic equipment
KR20180082273A (en) Computer readable recording medium and electronic apparatus for performing video call
CN112017304B (en) Method, apparatus, electronic device and medium for presenting augmented reality data
CN114067085A (en) Virtual object display method and device, electronic equipment and storage medium
CN112788443B (en) Interaction method and system based on optical communication device
CN108537149A (en) Image processing method, device, storage medium and electronic equipment
CN111212234A (en) Shooting method, device, equipment and storage medium
CN111899349A (en) Model presentation method and device, electronic equipment and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant