CN111698646A - Positioning method and device


Info

Publication number
CN111698646A
CN111698646A (application CN202010514659.4A)
Authority
CN
China
Prior art keywords
target
information
pose information
determining
relative pose
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010514659.4A
Other languages
Chinese (zh)
Other versions
CN111698646B (en)
Inventor
潘思霁
刘小兵
李炳泽
揭志伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Shangtang Technology Development Co Ltd
Original Assignee
Zhejiang Shangtang Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Shangtang Technology Development Co Ltd filed Critical Zhejiang Shangtang Technology Development Co Ltd
Priority to CN202010514659.4A
Publication of CN111698646A
Application granted
Publication of CN111698646B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/024 Guidance services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/50 Network services
    • H04L 67/52 Network services specially adapted for the location of the user terminal
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 64/00 Locating users or terminals or network equipment for network management purposes, e.g. mobility management

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a positioning method and apparatus, including: acquiring a live-action image of a target place shot by each AR device in a plurality of AR devices corresponding to an associated user group; determining current pose information of each AR device based on the live-action image of the target place shot by each AR device, the pose information comprising position information and orientation information; for a target AR device in the plurality of AR devices, determining relative pose information of at least one other AR device relative to the target AR device based on the current respective pose information of the plurality of AR devices; and generating relative pose prompt information for the target AR device according to the determined relative pose information, and sending the relative pose prompt information to the target AR device so as to present the relative pose prompt information of the at least one other AR device in a live-action image shot by the target AR device.

Description

Positioning method and device
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a positioning method and apparatus.
Background
In a large amusement venue, the scene is complex and visitors find it difficult to locate one another. Navigation is usually performed with a traditional GPS positioning product, but GPS positioning accuracy is limited, GPS signals may be weak in some indoor areas, and the product may be unusable in some areas, so mutual positioning in a large amusement park is difficult.
Disclosure of Invention
The embodiment of the disclosure at least provides a positioning method and a positioning device.
In a first aspect, an embodiment of the present disclosure provides a positioning method, including:
acquiring a live-action image of a target place shot by each AR device in a plurality of AR devices corresponding to an associated user group;
determining current pose information of each AR device based on the live-action image of the target place shot by each AR device; the pose information comprises position information and orientation information;
for a target AR device in the plurality of AR devices, determining relative pose information of at least one other AR device relative to the target AR device based on the current respective pose information of the plurality of AR devices;
and generating relative pose prompt information for the target AR device according to the determined relative pose information, and sending the relative pose prompt information to the target AR device so as to present the relative pose prompt information of the at least one other AR device in a live-action image shot by the target AR device.
By the method, the current pose information of each AR device can be determined directly from the live-action image shot by each AR device in the associated user group, and after the target AR device is determined, the relative pose prompt information of the other AR devices relative to the target AR device is displayed on the target AR device, so that users in the group can locate one another even in areas where GPS signals are weak.
In one possible implementation, after determining the current pose information of each AR device, the method further comprises:
and determining the distance information of the at least one other AR device relative to the target AR device based on the current pose information of each AR device, and sending the distance information to the target AR device for display.
In a possible implementation, the distance information of the at least one other AR device with respect to the target AR device includes a straight-line distance between the at least one other AR device and the target AR device, and/or a walking distance between the at least one other AR device and the target AR device.
In one possible embodiment, the determining current pose information of each AR device based on a live-action image of a target site captured by each AR device includes:
and aiming at each AR device, matching the live-action image of the target place shot by the AR device with a pre-established three-dimensional model corresponding to the target place, and determining the current pose information of the AR device based on the matching result.
In one possible embodiment, the determining the relative pose information of at least one other AR device with respect to the target AR device based on the current respective pose information of the plurality of AR devices includes:
and determining at least one other AR device which is not in the live-action image shot by the target AR device from the plurality of AR devices based on the current respective pose information of the plurality of AR devices, and determining the relative pose information of the at least one other AR device relative to the target AR device.
In one possible implementation, the target AR device is determined according to the following method:
and determining any one AR device as the target AR device in response to a preset friend positioning trigger instruction generated by the user based on any one of the plurality of AR devices.
In one possible embodiment, the generating relative pose hint information for the target AR device according to the determined relative pose information includes:
determining a virtual indicator corresponding to the relative pose information based on the relative pose information of the at least one other AR device with respect to the target AR device;
and taking the virtual indicator corresponding to the relative pose information of each other AR device relative to the target AR device as the relative pose prompt information.
In a second aspect, an embodiment of the present disclosure further provides a positioning apparatus, including:
the system comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring a live-action image of a target place shot by each AR device in a plurality of AR devices corresponding to an associated user group;
the first determining module is used for determining the current pose information of each AR device based on the live-action image of the target place shot by each AR device; the pose information comprises position information and orientation information;
a second determining module, configured to determine, for a target AR device of the multiple AR devices, relative pose information of at least one other AR device with respect to the target AR device based on current respective pose information of the multiple AR devices;
and the sending module is used for generating relative pose prompt information for the target AR device according to the determined relative pose information and sending the relative pose prompt information to the target AR device so as to present the relative pose prompt information of the at least one other AR device in a live-action image shot by the target AR device.
In one possible implementation, after determining the current pose information of each AR device, the first determining module is further configured to:
and determining the distance information of the at least one other AR device relative to the target AR device based on the current pose information of each AR device, and sending the distance information to the target AR device for display.
In a possible implementation, the distance information of the at least one other AR device with respect to the target AR device includes a straight-line distance between the at least one other AR device and the target AR device, and/or a walking distance between the at least one other AR device and the target AR device.
In one possible implementation, the first determining module, when determining the current pose information of each AR device based on the live-action image of the target site captured by each AR device, is configured to:
and aiming at each AR device, matching the live-action image of the target place shot by the AR device with a pre-established three-dimensional model corresponding to the target place, and determining the current pose information of the AR device based on the matching result.
In one possible embodiment, the second determining module, when determining the relative pose information of at least one other AR device with respect to the target AR device based on the current respective pose information of the plurality of AR devices, is configured to:
and determining at least one other AR device which is not in the live-action image shot by the target AR device from the plurality of AR devices based on the current respective pose information of the plurality of AR devices, and determining the relative pose information of the at least one other AR device relative to the target AR device.
In a possible implementation, the second determining module is further configured to determine the target AR device according to the following method:
and determining any one AR device as the target AR device in response to a preset friend positioning trigger instruction generated by the user based on any one of the plurality of AR devices.
In one possible implementation, the sending module, when generating the relative pose hint information for the target AR device according to the determined relative pose information, is configured to:
determining a virtual indicator corresponding to the relative pose information based on the relative pose information of the at least one other AR device with respect to the target AR device;
and taking the virtual indicator corresponding to the relative pose information of each other AR device relative to the target AR device as the relative pose prompt information.
In a third aspect, an embodiment of the present disclosure further provides a computer device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when the computer device is running, the machine-readable instructions when executed by the processor performing the steps of the first aspect described above, or any possible implementation of the first aspect.
In a fourth aspect, this disclosed embodiment also provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the steps in the first aspect or any one of the possible implementation manners of the first aspect.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required in the embodiments are briefly described below. The drawings herein are incorporated in and form a part of the specification; they illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the technical solutions of the present disclosure. It should be appreciated that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those skilled in the art can derive additional related drawings from them without inventive effort.
Fig. 1 shows a flow chart of a positioning method provided by an embodiment of the present disclosure;
fig. 2 illustrates an AR device location presentation diagram provided by an embodiment of the present disclosure;
fig. 3 illustrates another AR device location presentation diagram provided by an embodiment of the present disclosure;
fig. 4 is a schematic diagram illustrating an architecture of a positioning apparatus provided in an embodiment of the present disclosure;
fig. 5 shows a schematic structural diagram of a computer device 500 provided by the embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more clear, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only a part of the embodiments of the present disclosure, not all of the embodiments. The components of the embodiments of the present disclosure, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure, presented in the figures, is not intended to limit the scope of the claimed disclosure, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the disclosure without making creative efforts, shall fall within the protection scope of the disclosure.
In the related art, when two users perform positioning through GPS, the GPS positioning is easily affected by the positioning environment, for example, in a basement or the like, the GPS signal is weak, which may result in positioning failure.
Based on this, the present disclosure provides a positioning method and apparatus, which may determine the current pose information of each AR device directly from the live-action image captured by each AR device in an associated user group, and, after determining a target AR device, display the relative pose prompt information of the other AR devices relative to the target AR device on the target AR device.
The above-mentioned drawbacks were identified by the inventors after practical and careful study; therefore, the discovery of the above problems and the solutions proposed below for them should be regarded as the inventors' contribution to the present disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
In order to facilitate understanding of the present embodiment, a positioning method disclosed in the embodiments of the present disclosure is first described in detail, and an execution subject of the positioning method provided in the embodiments of the present disclosure is generally a server.
Referring to fig. 1, a flowchart of a positioning method provided in an embodiment of the present disclosure is shown, where the method includes steps 101 to 104, where:
step 101, acquiring a live-action image of a target place shot by each AR device in a plurality of AR devices corresponding to an associated user group.
The associated user group includes a plurality of users with a pre-established association. For example, if user A and user B have a pre-established association, user A and user B may serve as an associated user group; the association may be, for example, a friend relationship.
The users in the associated user group can shoot live-action images of the target place through their AR devices; the users may be located in different areas of the target place, and the captured live-action images are sent to the server and processed by the server. Illustratively, the target place may be an amusement park, an exhibition hall, a tourist attraction, or the like.
102, determining current pose information of each AR device based on a live-action image of a target place shot by each AR device; the pose information includes position information and orientation information.
In one possible implementation, when determining the current pose information of each AR device based on a live-action image of a target location captured by each AR device, for each AR device, the live-action image of the target location captured by the AR device may be matched with a pre-established three-dimensional model corresponding to the target location, and the current pose information of the AR device may be determined based on a matching result.
The three-dimensional model corresponding to the target location may be matched with the live-action images of the target location taken at various angles, and if a certain area of the three-dimensional model of the target location is successfully matched with the live-action images taken by the AR device, the current position information of the AR device in the target location and the orientation thereof may be determined.
Here, the position information may be position coordinates in a world coordinate system established for the target place, and the orientation information may be the angle between the shooting direction of the AR device and the x-axis of that world coordinate system when the live-action image is captured.
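As a concrete illustration of this matching step, the pre-established three-dimensional model can be regarded as a store of feature descriptors bound to 3D points in the venue's world coordinate system, so that localizing one frame becomes 2D-3D matching followed by a perspective-n-point solve. The following is a minimal sketch using OpenCV under exactly that assumption; the descriptor/point store, the camera intrinsic matrix K, and the z-up world frame are illustrative assumptions rather than details fixed by this disclosure.

```python
import cv2
import numpy as np

def estimate_device_pose(live_image, model_descriptors, model_points_3d, K):
    """Match one live-action frame against the pre-built model and solve for pose.

    model_descriptors: ORB descriptors of the model (N x 32, uint8) -- assumed layout
    model_points_3d:   the 3D world point behind each model descriptor (N x 3)
    K:                 3x3 camera intrinsic matrix of the AR device
    """
    orb = cv2.ORB_create(nfeatures=2000)
    keypoints, descriptors = orb.detectAndCompute(live_image, None)
    if descriptors is None:
        return None  # no features found; cannot localize from this frame

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(descriptors, model_descriptors)
    if len(matches) < 6:
        return None  # matching failed

    image_pts = np.float32([keypoints[m.queryIdx].pt for m in matches])
    object_pts = np.float32([model_points_3d[m.trainIdx] for m in matches])

    # Perspective-n-point with RANSAC to reject outlier matches
    ok, rvec, tvec, _ = cv2.solvePnPRansac(object_pts, image_pts, K, None)
    if not ok:
        return None

    R, _ = cv2.Rodrigues(rvec)
    position = (-R.T @ tvec).ravel()            # camera centre in world coordinates
    forward = R.T @ np.array([0.0, 0.0, 1.0])   # viewing direction in the world frame
    yaw_deg = float(np.degrees(np.arctan2(forward[1], forward[0])))  # angle to world x-axis
    return position, yaw_deg
```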
Step 103, for a target AR device of the multiple AR devices, determining, based on the current respective pose information of the multiple AR devices, relative pose information of at least one other AR device with respect to the target AR device.
The target AR device may be an AR device whose corresponding user needs to acquire relative pose information of users corresponding to other AR devices, and specifically, the target AR device may be determined according to the following method:
and determining any one AR device as the target AR device in response to a preset friend positioning trigger instruction generated by the user based on any one of the plurality of AR devices.
The preset friend positioning trigger instruction may be generated after the user corresponding to that AR device presses a button on the device for acquiring positioning information, or it may be detected from sound information and/or image information collected by the device.
Specifically, the preset friend positioning trigger instruction may include a preset sound trigger instruction and/or a preset limb action trigger instruction. When it includes a preset sound trigger instruction, the AR device may be controlled to collect sound information and send it to the server; the server then performs voice recognition on the sound information and detects, based on the recognition result, whether the user has issued the preset friend positioning trigger instruction. Alternatively, the AR device may be controlled to collect the sound information and perform voice recognition itself, sending only the recognition result to the server, which then detects whether the user has issued the preset friend positioning trigger instruction.
For example, if the preset sound trigger instruction is the voice information corresponding to "please inquire the positions of other friends for me", the AR device may be controlled to collect voice information and perform voice recognition on it; if the recognition result contains the voice information corresponding to "please inquire the positions of other friends for me", it is determined that the user has issued the preset sound trigger instruction, and that AR device is determined as the target AR device.
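On the server side, the check against the preset phrase can then be as simple as substring matching over the recognized transcript. A minimal sketch, assuming the server already receives a text transcript from a speech recognizer; the phrase, function name, and matching rule are illustrative assumptions.

```python
# Assumed trigger phrase, matching the example above
TRIGGER_PHRASE = "please inquire the positions of other friends for me"

def is_friend_locating_trigger(transcript: str) -> bool:
    """Return True if the recognized text contains the preset trigger phrase."""
    return TRIGGER_PHRASE in transcript.strip().lower()
```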
When the preset friend positioning trigger instruction includes a preset limb action trigger instruction, the AR device may be controlled to collect image information, that is, to collect live-action images, and send them to the server. The server then detects whether the user performs the preset limb action in the collected live-action images; if so, it is determined that the user has issued the preset friend positioning trigger instruction, and that AR device is determined as the target AR device.
For example, if the limb movement is preset as "waving", the server may detect whether the user in the live-action image performs the limb movement of "waving" after receiving the live-action image acquired by the AR device, and if so, determine that the user sends a preset friend positioning trigger instruction based on the AR device, and determine that the AR device is the target AR device.
Here, from the server's perspective, there may be a plurality of target AR devices: any AR device from which the preset friend positioning trigger instruction is sent may be treated as a target AR device. For example, if the AR devices corresponding to the associated user group include AR device 1, AR device 2, AR device 3, AR device 4 and AR device 5, then when AR device 1 is taken as the target AR device, AR devices 2, 3, 4 and 5 are all other AR devices; and when AR device 5 is taken as the target AR device, AR devices 1, 2, 3 and 4 are all other AR devices.
Here, when the AR device collects live-action images, it may collect a plurality of consecutive live-action images, or it may collect a live-action video and send each video frame of that video to the server as a live-action image.
When the server detects whether the user performs the preset limb movement in the collected live-action image, in a possible implementation manner, the live-action image may be input into a pre-trained neural network to obtain a limb movement detection result corresponding to the live-action image, and then whether the user performs the preset limb movement in the live-action image may be determined based on the limb movement detection result.
In another possible implementation manner, the position information of the preset body position points of the user in each live-action image may be detected, then the mutual position relationship between the preset body position points is determined based on the position information of the preset body position points, and when the mutual position relationship satisfies the mutual position relationship corresponding to the preset limb action, it may be determined that the user makes the preset limb action in the live-action image.
The preset body position points may be, for example, the head, the palm, the upper arm, the calf or the knee; more fine-grained position points may also be set, for example the tip of the little finger or the tip of the thumb.
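As an illustration of this second implementation, a "waving" check can be written directly over detected keypoints. A minimal sketch, assuming a keypoint detector already yields (x, y) image coordinates for named body position points in consecutive live-action frames; the specific mutual-position rule (wrist held above the elbow while the wrist oscillates horizontally) and all thresholds are illustrative assumptions.

```python
from typing import Dict, List, Tuple

Keypoints = Dict[str, Tuple[float, float]]  # e.g. {"wrist": (x, y), "elbow": (x, y)}

def is_waving(frames: List[Keypoints], min_swings: int = 2, min_dx: float = 15.0) -> bool:
    """Detect a wave from per-frame keypoints (image y grows downwards)."""
    # Keep only frames where the wrist is above the elbow
    raised = [f for f in frames if f["wrist"][1] < f["elbow"][1]]
    if len(raised) < 3:
        return False

    # Count direction reversals of the wrist's horizontal motion
    xs = [f["wrist"][0] for f in raised]
    swings, direction = 0, 0
    for prev, cur in zip(xs, xs[1:]):
        dx = cur - prev
        if abs(dx) < min_dx:
            continue  # ignore jitter below the motion threshold
        new_dir = 1 if dx > 0 else -1
        if direction != 0 and new_dir != direction:
            swings += 1
        direction = new_dir
    return swings >= min_swings
```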
In one possible implementation, when determining the relative pose information of at least one other AR device with respect to the target AR device based on the current respective pose information of the plurality of AR devices, at least one other AR device that is not in the live-action image captured by the target AR device may be determined from the plurality of AR devices based on the current respective pose information of the plurality of AR devices, and the relative pose information of the at least one other AR device with respect to the target AR device may be determined.
Specifically, after the live-action image shot by the target AR device is acquired, face recognition may be performed on it, and the face information of each user appearing in the image is determined from the recognition result; the face information of each user in the associated user group may be stored on the server in advance. If, based on the recognized face information, a user of the associated user group is detected in the live-action image shot by the target AR device, that user is already close to the user of the target AR device, and, to save computing resources, the relative pose information between that user's device and the target AR device need not be determined.
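The relative pose computation itself reduces, in the planar case, to expressing the other device's world pose in the target device's own frame. A minimal sketch, assuming each pose is simplified to a position (x, y) in the venue's world coordinate system plus a heading angle in degrees (the orientation information); the Pose structure and the angle conventions are illustrative assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float            # world coordinates in the target place
    y: float
    heading_deg: float  # angle to the world x-axis

def relative_pose(target: Pose, other: Pose):
    """Return (distance, bearing, heading difference) of `other` as seen from `target`."""
    dx, dy = other.x - target.x, other.y - target.y
    distance = math.hypot(dx, dy)
    # Bearing measured from the target device's heading:
    # 0 deg = straight ahead, positive = counter-clockwise (to the left)
    bearing = (math.degrees(math.atan2(dy, dx)) - target.heading_deg + 180) % 360 - 180
    heading_diff = (other.heading_deg - target.heading_deg + 180) % 360 - 180
    return distance, bearing, heading_diff
```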
And step 104, generating relative pose prompt information for the target AR device according to the determined relative pose information, and sending the relative pose prompt information to the target AR device so as to present the relative pose prompt information of the at least one other AR device in a live-action image shot by the target AR device.
In a possible implementation manner, the relative pose indicating information may be represented in the form of a virtual indicator, and specifically, when generating the relative pose indicating information for the target AR device according to the determined relative pose information, a virtual indicator corresponding to the relative pose information may be determined based on the relative pose information of the at least one other AR device with respect to the target AR device; and then taking the virtual indicator corresponding to the relative pose information of each other AR device relative to the target AR device as the relative pose prompt information.
The virtual indicator corresponding to the relative pose information corresponding to each AR device may include a virtual indicator corresponding to the identification information of the AR device and a virtual arrow corresponding to the orientation information in the relative pose information.
The virtual indicator corresponding to the identification information of the AR device may be, for example, a virtual avatar set by a user of the AR device, and the virtual arrow corresponding to the orientation information in the relative pose information may be a virtual arrow pointing to a direction corresponding to the orientation information.
For example, if the positions of AR device 1 and AR device 2 are as shown in fig. 2, and AR device 1 is taken as the target AR device, the virtual indicator corresponding to the relative pose information of AR device 2 with respect to AR device 1 consists of the virtual indicator corresponding to the identification information of AR device 2 and the virtual arrow pointing in the direction corresponding to the orientation information in the relative pose information.
After the virtual indicators corresponding to the relative pose information are determined, the display positions of the virtual indicators and the corresponding virtual arrows can be determined according to the orientation information in the relative pose information; the virtual indicators and virtual arrows are rendered at the determined display positions, fused with the live-action image shot by the target AR device, and then displayed through the target AR device.
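One way to realize this display-position step is to map the bearing computed earlier onto a horizontal screen coordinate, pinning the arrow to the nearest screen edge when the friend lies outside the camera's field of view. A minimal sketch; the screen width, the field of view, and the sign convention (positive bearing = to the left) are illustrative assumptions.

```python
def indicator_screen_x(bearing_deg: float, screen_w: int = 1080,
                       half_fov_deg: float = 30.0) -> int:
    """Return the x pixel at which to render the indicator for a given bearing."""
    if abs(bearing_deg) <= half_fov_deg:
        # Inside the field of view: linear mapping, 0 deg -> screen centre
        frac = 0.5 - bearing_deg / (2 * half_fov_deg)
        return int(frac * (screen_w - 1))
    # Outside the field of view: pin to the nearest edge so the arrow still guides the user
    return 0 if bearing_deg > 0 else screen_w - 1
```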
In a possible implementation manner, after determining the current pose information of each AR device, distance information of the at least one other AR device with respect to the target AR device may be further determined based on the current pose information of each AR device, and the distance information is sent to the target AR device for presentation.
The distance information of the at least one other AR device relative to the target AR device includes a linear distance between the at least one other AR device and the target AR device, and/or a walking distance between the at least one other AR device and the target AR device.
For example, as shown in fig. 3, if the target AR device is at a and the other AR devices are at B, the straight-line distances between the other AR devices and the target AR device are shown as a straight line 1, and the walking distances between the other AR devices and the target AR device are shown as roads in the figure.
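The two distances can be computed in the obvious ways: the straight-line distance is Euclidean, and the walking distance can be taken as a shortest path over a graph of the venue's walkways. A minimal sketch with Dijkstra's algorithm; the graph representation (node -> list of (neighbour, edge length)) is an assumption, since the embodiment does not fix one.

```python
import heapq
import math

def straight_line(a, b):
    """Euclidean distance between two (x, y) positions."""
    return math.dist(a, b)

def walking_distance(graph, start, goal):
    """Shortest walkable route length; graph maps node -> [(neighbour, edge_length)]."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            return d
        if d > dist.get(node, math.inf):
            continue  # stale heap entry
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, math.inf):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return math.inf  # no walkable route found
```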
According to the positioning method provided by the present disclosure, the current pose information of each AR device can be determined directly from the live-action image shot by each AR device in the associated user group, and after the target AR device is determined, the relative pose prompt information of the other AR devices relative to the target AR device is displayed on the target AR device.
It will be understood by those skilled in the art that, in the method of the present disclosure, the order in which the steps are written does not imply a strict order of execution or impose any limitation on the implementation; the specific order of execution of the steps should be determined by their function and possible inherent logic.
Based on the same inventive concept, the embodiment of the present disclosure further provides a positioning apparatus corresponding to the positioning method, and since the principle of the apparatus in the embodiment of the present disclosure for solving the problem is similar to the positioning method described above in the embodiment of the present disclosure, the implementation of the apparatus may refer to the implementation of the method, and repeated details are not repeated.
Referring to fig. 4, there is shown a schematic structural diagram of a positioning apparatus according to an embodiment of the present disclosure, the apparatus includes: an obtaining module 401, a first determining module 402, a second determining module 403, and a sending module 404; wherein,
an obtaining module 401, configured to obtain a live-action image of a target location captured by each AR device of multiple AR devices corresponding to an associated user group;
a first determining module 402, configured to determine current pose information of each AR device based on a live-action image of a target location captured by each AR device; the pose information comprises position information and orientation information;
a second determining module 403, configured to determine, for a target AR device of the multiple AR devices, relative pose information of at least one other AR device with respect to the target AR device based on current respective pose information of the multiple AR devices;
a sending module 404, configured to generate, according to the determined relative pose information, relative pose prompt information for the target AR device, and send the relative pose prompt information to the target AR device, so as to present the relative pose prompt information of the at least one other AR device in a live-action image captured by the target AR device.
In one possible implementation, the first determining module 402, after determining the current pose information of each AR device, is further configured to:
and determining the distance information of the at least one other AR device relative to the target AR device based on the current pose information of each AR device, and sending the distance information to the target AR device for display.
In a possible implementation, the distance information of the at least one other AR device with respect to the target AR device includes a straight-line distance between the at least one other AR device and the target AR device, and/or a walking distance between the at least one other AR device and the target AR device.
In one possible implementation, the first determining module 402, when determining the current pose information of each AR device based on the live-action image of the target site captured by each AR device, is configured to:
and aiming at each AR device, matching the live-action image of the target place shot by the AR device with a pre-established three-dimensional model corresponding to the target place, and determining the current pose information of the AR device based on the matching result.
In one possible implementation, the second determining module 403, when determining the relative pose information of at least one other AR device with respect to the target AR device based on the current respective pose information of the plurality of AR devices, is configured to:
and determining at least one other AR device which is not in the live-action image shot by the target AR device from the plurality of AR devices based on the current respective pose information of the plurality of AR devices, and determining the relative pose information of the at least one other AR device relative to the target AR device.
In a possible implementation, the second determining module 403 is further configured to determine the target AR device according to the following method:
and determining any one AR device as the target AR device in response to a preset friend positioning trigger instruction generated by the user based on any one of the plurality of AR devices.
In one possible implementation, the sending module 404, when generating the relative pose hint information for the target AR device according to the determined relative pose information, is configured to:
determining a virtual indicator corresponding to the relative pose information based on the relative pose information of the at least one other AR device with respect to the target AR device;
and taking the virtual indicator corresponding to the relative pose information of each other AR device relative to the target AR device as the relative pose prompt information.
By means of the apparatus, the current pose information of each AR device can be determined directly from the live-action image shot by each AR device in the associated user group, and after the target AR device is determined, the relative pose prompt information of the other AR devices relative to the target AR device is displayed on the target AR device.
The description of the processing flow of each module in the device and the interaction flow between the modules may refer to the related description in the above method embodiments, and will not be described in detail here.
Based on the same technical concept, the embodiment of the disclosure also provides a computer device. Referring to fig. 5, a schematic structural diagram of a computer device 500 provided in the embodiment of the present disclosure includes a processor 501, a memory 502, and a bus 503. The memory 502 is used for storing execution instructions and includes an internal memory 5021 and an external memory 5022; the internal memory 5021 temporarily stores operation data for the processor 501 and data exchanged with the external memory 5022 such as a hard disk, and the processor 501 exchanges data with the external memory 5022 through the internal memory 5021. When the computer device 500 operates, the processor 501 communicates with the memory 502 through the bus 503, so that the processor 501 executes the following instructions:
acquiring a live-action image of a target place shot by each AR device in a plurality of AR devices corresponding to an associated user group;
determining current pose information of each AR device based on the live-action image of the target place shot by each AR device; the pose information comprises position information and orientation information;
for a target AR device in the plurality of AR devices, determining relative pose information of at least one other AR device relative to the target AR device based on the current respective pose information of the plurality of AR devices;
and generating relative pose prompt information for the target AR device according to the determined relative pose information, and sending the relative pose prompt information to the target AR device so as to present the relative pose prompt information of the at least one other AR device in a live-action image shot by the target AR device.
The embodiments of the present disclosure also provide a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program performs the steps of the positioning method described in the above method embodiments. The storage medium may be a volatile or non-volatile computer-readable storage medium.
The computer program product of the positioning method provided in the embodiments of the present disclosure includes a computer-readable storage medium storing a program code, where instructions included in the program code may be used to execute the steps of the positioning method described in the above method embodiments, which may be referred to specifically for the above method embodiments, and are not described herein again.
The embodiments of the present disclosure also provide a computer program, which when executed by a processor implements any one of the methods of the foregoing embodiments. The computer program product may be embodied in hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied in a computer storage medium, and in another alternative embodiment, the computer program product is embodied in a software product, such as a software development kit (SDK).
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that: the above-mentioned embodiments are merely specific embodiments of the present disclosure, which are used for illustrating the technical solutions of the present disclosure and not for limiting the same, and the scope of the present disclosure is not limited thereto, and although the present disclosure is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that: any person skilled in the art can modify or easily conceive of the technical solutions described in the foregoing embodiments or equivalent technical features thereof within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present disclosure, and should be construed as being included therein. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (10)

1. A method of positioning, comprising:
acquiring a live-action image of a target place shot by each AR device in a plurality of AR devices corresponding to an associated user group;
determining current pose information of each AR device based on the live-action image of the target place shot by each AR device; the pose information comprises position information and orientation information;
for a target AR device in the plurality of AR devices, determining relative pose information of at least one other AR device relative to the target AR device based on the current respective pose information of the plurality of AR devices;
and generating relative pose prompt information for the target AR device according to the determined relative pose information, and sending the relative pose prompt information to the target AR device so as to present the relative pose prompt information of the at least one other AR device in a live-action image shot by the target AR device.
2. The method of claim 1, wherein after determining the current pose information for each AR device, the method further comprises:
and determining the distance information of the at least one other AR device relative to the target AR device based on the current pose information of each AR device, and sending the distance information to the target AR device for display.
3. The method of claim 2, wherein the distance information of the at least one other AR device from the target AR device comprises a linear distance between the at least one other AR device and the target AR device, and/or a walking distance between the at least one other AR device and the target AR device.
4. The method of claim 1, wherein determining the current pose information of each AR device based on the live-action image of the target site taken by each AR device comprises:
and aiming at each AR device, matching the live-action image of the target place shot by the AR device with a pre-established three-dimensional model corresponding to the target place, and determining the current pose information of the AR device based on the matching result.
5. The method of claim 1, wherein determining relative pose information of at least one other AR device with respect to the target AR device based on current respective pose information of the plurality of AR devices comprises:
and determining at least one other AR device which is not in the live-action image shot by the target AR device from the plurality of AR devices based on the current respective pose information of the plurality of AR devices, and determining the relative pose information of the at least one other AR device relative to the target AR device.
6. The method of claim 1, wherein the target AR device is determined according to the following method:
and determining any one AR device as the target AR device in response to a preset friend positioning trigger instruction generated by the user based on any one of the plurality of AR devices.
7. The method of claim 1, wherein generating relative pose hint information for the target AR device based on the determined relative pose information comprises:
determining a virtual indicator corresponding to the relative pose information based on the relative pose information of the at least one other AR device with respect to the target AR device;
and taking the virtual indicator corresponding to the relative pose information of each other AR device relative to the target AR device as the relative pose prompt information.
8. A positioning device, comprising:
the system comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring a live-action image of a target place shot by each AR device in a plurality of AR devices corresponding to an associated user group;
the first determining module is used for determining the current pose information of each AR device based on the live-action image of the target place shot by each AR device; the pose information comprises position information and orientation information;
a second determining module, configured to determine, for a target AR device of the multiple AR devices, relative pose information of at least one other AR device with respect to the target AR device based on current respective pose information of the multiple AR devices;
and the sending module is used for generating relative pose prompt information for the target AR device according to the determined relative pose information and sending the relative pose prompt information to the target AR device so as to present the relative pose prompt information of the at least one other AR device in a live-action image shot by the target AR device.
9. A computer device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when a computer device is running, the machine-readable instructions when executed by the processor performing the steps of the positioning method according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, performs the steps of the positioning method according to any one of claims 1 to 7.
CN202010514659.4A 2020-06-08 2020-06-08 Positioning method and device Active CN111698646B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010514659.4A 2020-06-08 2020-06-08 Positioning method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010514659.4A 2020-06-08 2020-06-08 Positioning method and device

Publications (2)

Publication Number Publication Date
CN111698646A 2020-09-22
CN111698646B 2022-10-18

Family

ID=72479830

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010514659.4A Active CN111698646B (en) 2020-06-08 2020-06-08 Positioning method and device

Country Status (1)

Country Link
CN (1) CN111698646B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112212865A (en) * 2020-09-23 2021-01-12 北京市商汤科技开发有限公司 Guiding method and device in AR scene, computer equipment and storage medium
CN112650422A (en) * 2020-12-17 2021-04-13 咪咕文化科技有限公司 AR interaction method and device of equipment, electronic equipment and storage medium
CN112817454A (en) * 2021-02-02 2021-05-18 深圳市慧鲤科技有限公司 Information display method and device, related equipment and storage medium
CN114625468A (en) * 2022-03-21 2022-06-14 北京字跳网络技术有限公司 Augmented reality picture display method and device, computer equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107220726A (en) * 2017-04-26 2017-09-29 消检通(深圳)科技有限公司 Fire-fighting equipment localization method, mobile terminal and system based on augmented reality
CN109974733A (en) * 2019-04-02 2019-07-05 百度在线网络技术(北京)有限公司 POI display methods, device, terminal and medium for AR navigation
CN110457414A (en) * 2019-07-30 2019-11-15 Oppo广东移动通信有限公司 Offline map processing, virtual objects display methods, device, medium and equipment
CN110462420A (en) * 2017-04-10 2019-11-15 蓝色视觉实验室英国有限公司 Alignment by union
CN110555876A (en) * 2018-05-30 2019-12-10 百度在线网络技术(北京)有限公司 Method and apparatus for determining position
CN110738737A (en) * 2019-10-15 2020-01-31 北京市商汤科技开发有限公司 AR scene image processing method and device, electronic equipment and storage medium
CN110794955A (en) * 2018-08-02 2020-02-14 广东虚拟现实科技有限公司 Positioning tracking method, device, terminal equipment and computer readable storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110462420A (en) * 2017-04-10 2019-11-15 蓝色视觉实验室英国有限公司 Alignment by union
CN107220726A (en) * 2017-04-26 2017-09-29 消检通(深圳)科技有限公司 Fire-fighting equipment localization method, mobile terminal and system based on augmented reality
CN110555876A (en) * 2018-05-30 2019-12-10 百度在线网络技术(北京)有限公司 Method and apparatus for determining position
CN110794955A (en) * 2018-08-02 2020-02-14 广东虚拟现实科技有限公司 Positioning tracking method, device, terminal equipment and computer readable storage medium
CN109974733A (en) * 2019-04-02 2019-07-05 百度在线网络技术(北京)有限公司 POI display methods, device, terminal and medium for AR navigation
CN110457414A (en) * 2019-07-30 2019-11-15 Oppo广东移动通信有限公司 Offline map processing, virtual objects display methods, device, medium and equipment
CN110738737A (en) * 2019-10-15 2020-01-31 北京市商汤科技开发有限公司 AR scene image processing method and device, electronic equipment and storage medium

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112212865A (en) * 2020-09-23 2021-01-12 北京市商汤科技开发有限公司 Guiding method and device in AR scene, computer equipment and storage medium
CN112650422A (en) * 2020-12-17 2021-04-13 咪咕文化科技有限公司 AR interaction method and device of equipment, electronic equipment and storage medium
CN112817454A (en) * 2021-02-02 2021-05-18 深圳市慧鲤科技有限公司 Information display method and device, related equipment and storage medium
CN114625468A (en) * 2022-03-21 2022-06-14 北京字跳网络技术有限公司 Augmented reality picture display method and device, computer equipment and storage medium
CN114625468B (en) * 2022-03-21 2023-09-22 北京字跳网络技术有限公司 Display method and device of augmented reality picture, computer equipment and storage medium

Also Published As

Publication number Publication date
CN111698646B 2022-10-18

Similar Documents

Publication Publication Date Title
CN111698646B (en) Positioning method and device
CN112148197A (en) Augmented reality AR interaction method and device, electronic equipment and storage medium
US9144744B2 (en) Locating and orienting device in space
CN106125903B (en) Multi-person interaction system and method
JP2019067383A5 (en)
JP5205187B2 (en) Input system and input method
KR101533320B1 (en) Apparatus for acquiring 3 dimension object information without pointer
WO2019019248A1 (en) Virtual reality interaction method, device and system
WO2015186436A1 (en) Image processing device, image processing method, and image processing program
CN111295234A (en) Method and system for generating detailed data sets of an environment via game play
CN112729327A (en) Navigation method, navigation device, computer equipment and storage medium
CN112179331B (en) AR navigation method, AR navigation device, electronic equipment and storage medium
CN111640203B (en) Image processing method and device
CN111665945B (en) Tour information display method and device
JP2014217566A (en) Hunting game distribution system
CN112419388A (en) Depth detection method and device, electronic equipment and computer readable storage medium
JP2005256232A (en) Method, apparatus and program for displaying 3d data
CN112882576A (en) AR interaction method and device, electronic equipment and storage medium
CN112181141A (en) AR positioning method, AR positioning device, electronic equipment and storage medium
CN116954367A (en) Virtual reality interaction method, system and equipment
US20190369735A1 (en) Method and system for inputting content
CN113010009B (en) Object sharing method and device
CN112788443B (en) Interaction method and system based on optical communication device
CN111665943B (en) Pose information display method and device
CN111638794A (en) Display control method and device for virtual cultural relics

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant