CN110969704B - Mark generation tracking method and device based on AR guide - Google Patents


Info

Publication number
CN110969704B
CN110969704B (Application CN201911182161.6A)
Authority
CN
China
Prior art keywords
terminal
mark
information
point
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911182161.6A
Other languages
Chinese (zh)
Other versions
CN110969704A (en)
Inventor
王一男
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xinshijie Technology Co ltd
Original Assignee
Beijing Xinshijie Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xinshijie Technology Co ltd filed Critical Beijing Xinshijie Technology Co ltd
Priority to CN201911182161.6A priority Critical patent/CN110969704B/en
Publication of CN110969704A publication Critical patent/CN110969704A/en
Application granted granted Critical
Publication of CN110969704B publication Critical patent/CN110969704B/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/003Navigation within 3D models or images

Abstract

The application provides an AR-guidance-based mark generation and tracking method and device. The mark generation method comprises: acquiring coordinate information of a first terminal and taking it as target point coordinate information; selecting a preset 3D marker pattern; placing the 3D marker pattern above the target point to form an AR stereoscopic graphic; binding the target point coordinate information with the AR stereoscopic graphic to form an AR air marker information point; and sending the AR air marker information point to a second terminal for tracking. The mark tracking method determines the distance and direction between the second terminal and the target point from their respective coordinate information, and renders that distance and direction on the terminal screen in AR form. By displaying the target's bearing and distance in AR form, the application enables the target to be located and tracked rapidly.

Description

Mark generation tracking method and device based on AR guide
Technical Field
The application belongs to the technical field of AR information processing, and particularly relates to a mark generation tracking method and device based on AR guidance.
Background
In daily life, many scenarios require marking a place so that it can be found again later, for example finding a parked car, or meeting a stranger at an agreed location (such as picking someone up or receiving a delivery).
In the prior art, a party with such a requirement may use a landmark building as a reference for the place, leave a physical marker at the place, or take a photograph of the place as a reminder or to share with the other party. However, these approaches are not intuitive: a landmark building may be too large to pinpoint a spot, a physical marker may be obscured, and a photograph may differ from the real scene, all of which make the marked location troublesome to find.
Among existing digital approaches, an app with a planar map or navigation function can be used to record and mark a place, making it easy for others to track or find it. However, most such apps present the place to the user on a planar (2D) map, and a user with a poor sense of direction finds it difficult to determine his or her current position and the bearing of the marked point from the location shown on the map.
Disclosure of Invention
In order to solve at least one of the technical problems above, the application provides an AR-guidance-based mark generation and tracking method and device, which generate AR air markers so that the other party can conveniently track or find the target.
In a first aspect of the present application, a mark generation method based on AR guidance is applied to a first terminal for locating a target point. The mark generation method includes: acquiring coordinate information of the first terminal and taking it as target point coordinate information; selecting a preset 3D marker pattern; placing the 3D marker pattern above the target point to form an AR stereoscopic graphic; binding the target point coordinate information with the AR stereoscopic graphic to form an AR air marker information point; and sending the AR air marker information point to a second terminal for tracking.
Preferably, when forming the AR stereoscopic graphic, the method further includes: loading the coordinate information of the target point onto the 3D marker pattern.
Preferably, when forming the AR stereoscopic graphic, the method further includes: acquiring the time at which the AR stereoscopic graphic is formed and loading it onto the 3D marker pattern.
Preferably, before sending the AR air marker information point to the second terminal for tracking, the method further comprises: selecting a marking mode, the marking mode comprising a fixed positioning marking mode or a mobile positioning marking mode, wherein in the mobile positioning marking mode the coordinate information of the first terminal is updated at a predetermined time interval, and the AR air marker information point is generated and sent to the second terminal.
In a second aspect of the present application, a marker tracking method based on AR guidance is applied to a second terminal for tracking a target point. The marker tracking method includes: acquiring coordinate information of the target point and the AR stereoscopic graphic from an AR air marker information point sent by a first terminal for positioning the target; acquiring coordinate information of the second terminal and second terminal screen orientation information; moving the second terminal's screen orientation and, when the coordinate information of the target point falls within the coverage of the screen orientation, loading the AR stereoscopic graphic at the target point within the screen of the second terminal; and determining the distance between the second terminal and the target point from the coordinate information of both, and placing the distance on the AR stereoscopic graphic.
Preferably, the acquiring the coordinate information of the second terminal and the second terminal screen orientation information includes: acquiring the screen orientation of the second terminal through a gyroscope; and acquiring longitude and latitude coordinates of the second terminal through a GPS or a network.
Preferably, loading the AR stereoscopic graphic at a target point within a screen of the second terminal includes: calculating the offset of the coordinate information of the target point relative to the second terminal; and determining a loading position of the target point on the screen of the second terminal according to the offset by taking the second terminal as an origin, and displaying the AR stereo graph at the loading position.
In a third aspect of the present application, a marker generation device based on AR guidance, applied to a first terminal for locating a target point, includes: the target positioning module is used for acquiring the coordinate information of the first terminal and taking the coordinate information as the coordinate information of a target point; the marking pattern selection module is used for selecting a preset 3D marking pattern; the AR three-dimensional graph generating module is used for placing the 3D mark pattern above the target point to form an AR three-dimensional graph; the AR air mark information point generation module is used for binding the coordinate information of the target point with the AR stereo graph to form an AR air mark information point; and the communication module is used for sending the AR air mark information point to a second terminal for tracking.
Preferably, the device further comprises a marking mode selection module, configured to select a marking mode before the AR air marker information point is sent to the second terminal for tracking, the marking mode comprising a fixed positioning marking mode or a mobile positioning marking mode, wherein in the mobile positioning marking mode the coordinate information of the first terminal is updated at a predetermined time interval, and the AR air marker information point is generated and sent to the second terminal.
In a fourth aspect of the present application, a marker-tracking device based on AR navigation is applied to a second terminal for tracking a target point, the marker-tracking device comprising: the target point acquisition module acquires coordinate information and an AR stereoscopic graph of the target point according to an AR air mark information point sent by a first terminal for positioning the target; the AR information initializing module is used for acquiring the coordinate information of the second terminal and the screen orientation information of the second terminal; the AR stereoscopic graph loading module is used for moving the second terminal screen orientation, and loading the AR stereoscopic graph to a target point in a screen of a second terminal when the coordinate information of the target point is located in the coverage range of the second terminal screen orientation; and the target point distance loading module is used for determining the distance between the second terminal and the target point according to the coordinate information of the second terminal and the coordinate information of the target point, and placing the distance on the AR three-dimensional graph.
The application realizes an AR-guidance-based mark generation and tracking method: by erecting a digital AR air marker in the air, sharing it with a third party, and displaying the target's bearing and distance in AR form, the target can be rapidly located and tracked.
Drawings
FIG. 1 is a flow chart of a preferred embodiment of the AR guide-based mark generation method of the present application.
FIG. 2 is a flow chart of a preferred embodiment of the AR guide-based marker tracking method of the present application.
FIG. 3 is a schematic diagram of vehicle location tracking in accordance with the embodiment of FIG. 2.
Fig. 4 is a schematic diagram of setting up an over-the-air SOS tag and viewing the tag according to the embodiment of fig. 2.
Fig. 5 is a person positioning schematic diagram of the embodiment of fig. 2 according to the present application.
Fig. 6 is a frame diagram of a preferred embodiment of the AR guide-based mark generating device of the present application.
FIG. 7 is a block diagram of a preferred embodiment of an AR guide-based marker-tracking device of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application become more apparent, the technical solutions in the embodiments of the present application will be described in more detail with reference to the accompanying drawings in the embodiments of the present application. In the drawings, the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The described embodiments are some, but not all, embodiments of the application. The embodiments described below by referring to the drawings are exemplary and intended to illustrate the present application and should not be construed as limiting the application. All other embodiments, based on the embodiments of the application, which are apparent to those of ordinary skill in the art without inventive faculty, are intended to be within the scope of the application. Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
According to a first aspect of the present application, as shown in fig. 1, a mark generation method based on AR guidance is applied to a first terminal for locating a target point, and the mark generation method includes:
step S11, acquiring coordinate information of the first terminal, and taking the coordinate information as target point coordinate information;
step S12, selecting a preset 3D marking pattern;
step S13, placing the 3D mark pattern above the target point to form an AR three-dimensional figure;
step S14, binding the coordinate information of the target point with the AR three-dimensional graph to form an AR air mark information point;
and step S15, the AR air mark information point is sent to a second terminal for tracking.
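Steps S11 to S15 amount to binding the first terminal's coordinates and a chosen 3D pattern into a single shareable record. The patent specifies no data format, so the following Python sketch is purely illustrative; all field and function names are assumptions, not from the patent:

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class ARAirMarker:
    """An AR air marker information point: target coordinates bound to a 3D pattern."""
    latitude: float    # target point latitude (taken from the first terminal, step S11)
    longitude: float   # target point longitude
    altitude: float    # height at which the marker floats, in metres
    pattern: str       # identifier of the preset 3D marker pattern (step S12)
    created_at: float  # optional timestamp loaded onto the pattern
    note: str = ""     # optional custom text, e.g. an SOS message

def generate_marker(lat, lon, alt, pattern="arrow_circle", note=""):
    # Steps S11-S14: bind the coordinates and pattern into one information point.
    return ARAirMarker(lat, lon, alt, pattern, time.time(), note)

def serialize(marker):
    # Step S15: the payload sent to the second terminal for tracking.
    return json.dumps(asdict(marker))

marker = generate_marker(39.9042, 116.4074, 10.0, note="SOS")
payload = serialize(marker)
```

The second terminal only needs to parse this payload to recover both the coordinates and the graphic to render.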
Before step S15, the user may choose to create a normal marker, a distress marker, or a real-time marker by tapping a marker-issuing button on the screen of a smart mobile device (e.g. a mobile phone). For the distress marker, since the AR air marker can be seen through any obstruction, the application can be used in field rescue: a trapped person sets up an air SOS marker, and rescuers can intuitively see that person's AR air marker and thereby determine the direction.
In some optional embodiments, when forming the AR stereoscopic graphic, the method further includes: loading the coordinate information of the target point onto the 3D marker pattern.
In some optional embodiments, when forming the AR stereoscopic graphic, the method further includes: acquiring the time at which the AR stereoscopic graphic is formed and loading it onto the 3D marker pattern.
Fig. 4 shows an AR stereoscopic graphic used for SOS search and rescue; when the first terminal performs the mark generation steps, a timestamp may be bound to the AR stereoscopic graphic in response to a user request.
The AR stereoscopic graphic can also carry custom text information, such as an SOS distress message.
In some alternative embodiments, before sending the AR air marker information point to the second terminal for tracking, the method comprises: selecting a marking mode, the marking mode comprising a fixed positioning marking mode or a mobile positioning marking mode, wherein in the mobile positioning marking mode the coordinate information of the first terminal is updated at a predetermined time interval, and the AR air marker information point is generated and sent to the second terminal.
The first terminal or the second terminal of the application may be a mobile phone, a tablet, or a wearable mobile device. The second terminal may be the same device as the first terminal, in which case the two communicate over the terminal's internal bus; it may also be a separate terminal device, in which case the two communicate over a network. In the first case, after the first terminal performs steps S11 to S15, the formed AR air marker information point is stored inside the terminal, which is equivalent to sending it to the second terminal, and the terminal can then be moved elsewhere.
As the vehicle positioning and tracking diagram of the embodiment shown in fig. 3 illustrates, after parking the vehicle at a fixed position, the user performs positioning through the (first) terminal to generate an AR air marker information point that stores only the vehicle's coordinate information. When selecting the marking mode, the user adopts the fixed positioning marking mode, and the AR air marker information point is stored (equivalently, sent to the second terminal). At this point the coordinate information of the AR air marker information point is decoupled from that of the first terminal, so the first terminal can move with the user; later, during mark tracking, the (second) terminal tracks and searches for the AR air marker information point.
In the present application, in view of positioning accuracy, the coordinate information generally represents an area, which may be two-dimensional or three-dimensional. In the two-dimensional case, different areas of interest can be marked and distinguished by longitude and latitude coordinates; in the three-dimensional case, height information is added on top of the longitude and latitude coordinates, so that areas at the same longitude and latitude but at different heights can be treated as different areas, achieving more precise area division.
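The patent leaves the area-matching rule abstract. One illustrative way to realize it (the cell size and height band values below are assumptions, not from the patent) is to compare coordinates by grid cell:

```python
import math

def same_region(a, b, deg_cell=0.0001, height_band_m=None):
    """Coordinate equality by region: two readings match when they fall in
    the same lat/lon cell (and, in the 3D case, the same height band).

    a, b: (lat, lon) or (lat, lon, altitude_m) tuples
    deg_cell: cell size in degrees (0.0001 deg is roughly 11 m of latitude)
    height_band_m: optional vertical band size for 3D region division
    """
    def cell(v):
        return math.floor(v / deg_cell)
    in_cell = cell(a[0]) == cell(b[0]) and cell(a[1]) == cell(b[1])
    if height_band_m is None or len(a) < 3 or len(b) < 3:
        return in_cell
    # 3D case: same lat/lon cell AND same height band.
    return in_cell and math.floor(a[2] / height_band_m) == math.floor(b[2] / height_band_m)
```

With a height band, two points at the same longitude and latitude but on different floors fall into different regions, matching the more precise 3D division described above.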
In an embodiment, when the second terminal is a different device from the first terminal, the mobile positioning marking mode may be selected to generate the AR air marker information point. For example, in the scenario of meeting a stranger, the first terminal updates its own coordinate information at a predetermined time interval, and generates and sends the AR air marker information point to the second terminal in real time.
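The mobile positioning marking mode described above can be sketched as a periodic update loop. The callables, interval, and marker fields below are illustrative assumptions; the patent only specifies that coordinates are refreshed at a predetermined interval and re-sent:

```python
import time

def run_mobile_marking(get_coordinates, send_to_peer, interval_s=5.0, updates=3):
    """Mobile positioning marking mode: re-read the first terminal's
    coordinates at a fixed interval and push a fresh marker each time.

    get_coordinates: callable returning (lat, lon, alt) of the first terminal
    send_to_peer:    callable that transmits the marker to the second terminal
    """
    sent = []
    for _ in range(updates):  # a real app would loop until the user stops marking
        lat, lon, alt = get_coordinates()
        marker = {"lat": lat, "lon": lon, "alt": alt, "ts": time.time()}
        send_to_peer(marker)
        sent.append(marker)
        time.sleep(interval_s)
    return sent

# Demo with a stub position source and an in-memory channel (no real GPS/network).
received = []
coords = iter([(39.90, 116.40, 5.0), (39.91, 116.41, 5.0), (39.92, 116.42, 5.0)])
sent = run_mobile_marking(lambda: next(coords), received.append,
                          interval_s=0.0, updates=3)
```

In the fixed positioning marking mode, by contrast, a single marker would be generated once and never refreshed.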
It should be understood that "first terminal" and "second terminal" are relative roles: a mobile phone, for example, may act as either, depending on which app function module is opened, with different modules corresponding to different roles. In the scenario of meeting a stranger, person A performs the mark generation steps of the so-called first terminal through the app's coordinate generation module, and person B may do the same through the coordinate generation module of his or her own phone's app. After the two parties communicate and exchange their AR air marker information points, person A performs the mark tracking steps of the so-called second terminal through the app's tracking module to find and track person B; likewise, person B can use the tracking module of his or her phone's app to find and track person A.
The 3D marker pattern of step S12 may be an arrow-plus-circle positioning pattern, as shown in figs. 3-5, in which the air marker floats at a predetermined height in the air with a preset shape and color. After leaving the current location, the user may use the application to view the air marker in the air and may share it with a third party.
The first terminal and the second terminal may interact directly or through a server, and the related functions can be integrated into corresponding apps, with different functions started by different buttons. An AR air marker information point for an SOS distress call is generally sent to the second terminal through the server in point-to-point fashion. In alternative embodiments it may also be broadcast through the server, with the server setting a receiving range according to the position information of the first terminal's target point; for example, users within 3 km of the trapped person may receive the related information as second terminals.
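The server-side receiving-range check can be implemented with a great-circle distance filter. This sketch uses the haversine formula; the function names and data shapes are assumptions for illustration, not from the patent:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in metres."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def recipients_in_range(target, terminals, range_m=3000.0):
    """Server-side broadcast filter: keep terminals within range_m of the target.

    target:    (lat, lon) of the first terminal's target point
    terminals: list of dicts with "lat"/"lon" for candidate second terminals
    """
    return [t for t in terminals
            if haversine_m(target[0], target[1], t["lat"], t["lon"]) <= range_m]

# Demo: one terminal ~140 m away, one ~111 km away; only the first receives.
near = {"lat": 39.901, "lon": 116.401}
far = {"lat": 40.9, "lon": 116.4}
hits = recipients_in_range((39.9, 116.4), [near, far])
```

The 3 km default mirrors the example in the text; a real server would make it configurable, as the patent notes.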
In a second aspect of the present application, a marker tracking method based on AR navigation is applied to a second terminal for tracking a target point, as shown in fig. 2, and the marker tracking method includes:
step S21, acquiring coordinate information and AR stereograph of a target point according to an AR air mark information point sent by a first terminal for positioning the target;
step S22, acquiring coordinate information of the second terminal and second terminal screen orientation information;
step S23, moving the second terminal screen orientation, and loading the AR stereograph to a target point in a screen of a second terminal when the coordinate information of the target point is located in the coverage range of the second terminal screen orientation;
and step S24, determining the distance between the second terminal and the target point according to the coordinate information of the second terminal and the coordinate information of the target point, and placing the distance on the AR three-dimensional graph.
In this embodiment, after the second terminal opens the app, it acquires the terminal's gyroscope and GPS information and initializes the AR scene: based on the gyroscope and GPS data, an AR three-dimensional scene is established with the x-axis pointing east, the y-axis pointing opposite to gravity, and the z-axis pointing south, so that the device position at startup is the (0, 0, 0) point, completing initialization. After initialization, the second terminal can view the air markers set up by the first terminal.
In an alternative embodiment, after initialization the terminal may use a filter button to choose to view guide information or air markers. When viewing air markers is selected, the terminal requests from the server all AR air marker information within a certain distance (configurable on the server) of the current terminal's GPS coordinate, so AR air marker information from several first terminals may be obtained, as shown in fig. 5. The x- and z-axis offsets are calculated from the GPS coordinate of each AR air marker returned by the server and the GPS coordinate of the current device; the x and z scene coordinates of the air marker information point are calculated from these offsets and the current device's position in the scene; the y coordinate is calculated from the difference between the altitude of the air marker returned by the server and the altitude of the current terminal; finally, the AR graphic of the air marker is placed at the calculated xyz coordinate position.
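The offset computation described above admits a simple flat-earth sketch in this embodiment's scene frame (x east, y up, z south). The metres-per-degree constant is a standard approximation, not a value from the patent, and holds only over short ranges:

```python
import math

M_PER_DEG_LAT = 111_320.0  # approximate metres per degree of latitude

def scene_position(device_gps, device_scene_xyz, marker_gps):
    """Map a marker's GPS coordinate into the AR scene frame:
    x points east, y opposite to gravity (up), z points south.

    device_gps / marker_gps: (latitude, longitude, altitude_m)
    device_scene_xyz:        current device position in the scene
    Uses a local flat-earth approximation, valid for short distances.
    """
    d_lat, d_lon, d_alt = device_gps
    m_lat, m_lon, m_alt = marker_gps
    m_per_deg_lon = M_PER_DEG_LAT * math.cos(math.radians(d_lat))
    east = (m_lon - d_lon) * m_per_deg_lon   # x offset
    up = m_alt - d_alt                       # y offset from the altitude difference
    south = (d_lat - m_lat) * M_PER_DEG_LAT  # z offset (z-axis points south)
    x0, y0, z0 = device_scene_xyz
    return (x0 + east, y0 + up, z0 + south)

# Demo: a marker ~111 m north of the device and 10 m above it.
x, y, z = scene_position((40.0, 116.0, 50.0), (0.0, 0.0, 0.0), (40.001, 116.0, 60.0))
```

A marker due north of the device yields a negative z (north is the negative south axis), which is why z comes out below zero in the demo.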
When looking for an air marker, the application calculates the position of the AR graphic on the smart mobile device (e.g. a mobile phone) using the gyroscope and GPS; when the device faces that position, the AR graphic and the distance between the terminal and the air marker are displayed on the device's screen. If the air marker is of the distress type, the first terminal or the server adds the SOS graphic pattern and the marker creation time on top of the normal marker display.
In some optional embodiments, acquiring the coordinate information of the second terminal and the second terminal screen orientation information includes: acquiring the screen orientation of the second terminal through a gyroscope; and acquiring longitude and latitude coordinates of the second terminal through a GPS or a network.
In some alternative embodiments, loading the AR stereoscopic graphic at a target point within a screen of the second terminal comprises: calculating the offset of the coordinate information of the target point relative to the second terminal; and determining a loading position of the target point on the screen of the second terminal according to the offset by taking the second terminal as an origin, and displaying the AR stereo graph at the loading position.
In a third aspect of the present application, there is provided a mark generation device based on AR navigation, corresponding to the above mark generation method, applied to a first terminal for locating a target point, as shown in fig. 6, the mark generation device comprising: the target positioning module is used for acquiring the coordinate information of the first terminal and taking the coordinate information as the coordinate information of a target point; the marking pattern selection module is used for selecting a preset 3D marking pattern; the AR three-dimensional graph generating module is used for placing the 3D mark pattern above the target point to form an AR three-dimensional graph; the AR air mark information point generation module is used for binding the coordinate information of the target point with the AR stereo graph to form an AR air mark information point; and the communication module is used for sending the AR air mark information point to a second terminal for tracking.
In some optional embodiments, the device further includes a marking mode selection module, configured to select a marking mode before the AR air marker information point is sent to the second terminal for tracking, where the marking mode includes a fixed positioning marking mode or a mobile positioning marking mode; in the mobile positioning marking mode, the coordinate information of the first terminal is updated at a predetermined time interval, and the AR air marker information point is generated and sent to the second terminal.
In a fourth aspect of the present application, there is provided an AR-guidance-based marker tracking device, corresponding to the above marker tracking method, applied to a second terminal for tracking a target point. As shown in fig. 7, the marker tracking device comprises: the target point acquisition module, which acquires coordinate information of the target point and the AR stereoscopic graphic from the AR air marker information point sent by a first terminal for positioning the target; the AR information initialization module, used for acquiring the coordinate information of the second terminal and the screen orientation information of the second terminal; the AR stereoscopic graphic loading module, used for moving the second terminal's screen orientation and, when the coordinate information of the target point falls within the coverage of the screen orientation, loading the AR stereoscopic graphic at the target point within the screen of the second terminal; and the target point distance loading module, used for determining the distance between the second terminal and the target point from the coordinate information of both and placing the distance on the AR stereoscopic graphic.
The application also provides a terminal device comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the processor executes the computer program to implement the AR-guidance-based mark generation and tracking method described above.
The present application also provides a readable storage medium storing a computer program which, when executed by a processor, implements the AR-guidance-based mark generation and tracking method described above. The computer-readable storage medium may be included in the device described in the above embodiments, or it may exist separately without being assembled into the device. The computer-readable storage medium carries one or more programs which, when executed by the device, process data as described above.
A terminal device suitable for use in implementing embodiments of the present application, such as a mobile device such as a cellular phone, includes a Central Processing Unit (CPU) that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) or a program loaded from a storage section into a Random Access Memory (RAM). In the RAM, various programs and data required for the operation of the device are also stored. The CPU, ROM and RAM are connected to each other by a bus. An input/output (I/O) interface is also connected to the bus.
The following components are connected to the I/O interface: input parts including touch screens, keys, scanning/photographing, etc.; an output section including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), etc., and a speaker, etc.; a storage section including a hard disk or the like; and a communication section including a network interface card such as a LAN card, a modem, and the like. The communication section performs communication processing via a network such as the internet.
In particular, according to embodiments of the present application, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via a communication portion, and/or installed from a removable medium. The above-described functions defined in the method of the present application are performed when the computer program is executed by a Central Processing Unit (CPU).
The computer storage medium of the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules or units described in the embodiments of the present application may be implemented by software, or may be implemented by hardware. The modules or units described may also be provided in a processor, the names of which do not in some cases constitute a limitation of the module or unit itself.
The foregoing is merely illustrative of the present application and is not intended to limit it; any changes or substitutions that would readily occur to those skilled in the art within the scope disclosed herein shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (7)

1. A mark generation method based on AR guidance, applied to a first terminal for locating a target point, the mark generation method comprising:
acquiring coordinate information of the first terminal, and taking the coordinate information as target point coordinate information;
selecting a preset 3D marking pattern;
placing the 3D mark pattern above the target point to form an AR stereoscopic graph, wherein forming the AR stereoscopic graph further comprises: loading the coordinate information of the target point onto the 3D mark pattern, acquiring the time when the AR stereoscopic graph is formed, and loading the time information onto the 3D mark pattern;
binding the coordinate information of the target point with the AR stereoscopic graph to form an AR air mark information point; and
sending the AR air mark information point to a second terminal for tracking.
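The generation flow of claim 1 — binding the target point's coordinates and the formation time to a selected 3D mark pattern — can be sketched as follows. This is a minimal illustration only: the claim specifies no payload format, so the field names, the JSON serialization, and the `"arrow"` pattern identifier are all hypothetical.

```python
import json
import time
from dataclasses import dataclass, asdict


@dataclass
class ARAirMark:
    """Hypothetical payload for one AR air mark information point."""
    pattern_id: str    # which preset 3D mark pattern was selected
    latitude: float    # target point = the first terminal's own fix
    longitude: float
    altitude: float
    created_at: float  # time when the AR stereoscopic graph was formed


def generate_air_mark(lat, lon, alt, pattern_id="arrow"):
    """Bind the first terminal's coordinates and the formation time to the
    chosen 3D mark pattern, yielding one serialized AR air mark information
    point ready to be sent to the second terminal."""
    mark = ARAirMark(pattern_id, lat, lon, alt, created_at=time.time())
    return json.dumps(asdict(mark))
```

In a real system the returned string would be the message body pushed to the second terminal over whatever transport the two terminals share.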
2. The AR guidance-based mark generation method of claim 1, wherein, before the AR air mark information point is sent to the second terminal for tracking, the method further comprises:
selecting a marking mode, wherein the marking mode comprises a fixed positioning marking mode or a mobile positioning marking mode; in the mobile positioning marking mode, the coordinate information of the first terminal is updated at a preset time interval, and an AR air mark information point is generated and sent to the second terminal.
3. A mark tracking method based on AR guidance, applied to a second terminal for tracking a target point, the mark tracking method comprising:
acquiring coordinate information of the target point and an AR stereoscopic graph according to an AR air mark information point sent by a first terminal for locating the target point;
acquiring coordinate information of the second terminal and second terminal screen orientation information;
changing the orientation of the second terminal screen, and loading the AR stereoscopic graph onto the target point in the screen of the second terminal when the coordinate information of the target point falls within the coverage range of the second terminal screen orientation;
determining the distance between the second terminal and the target point according to the coordinate information of the second terminal and the coordinate information of the target point, and placing the distance on the AR stereoscopic graph, which comprises initializing an AR scene based on gyroscope and GPS information of the second terminal: according to the gyroscope and GPS data, establishing an AR three-dimensional scene with the x-axis pointing east, the y-axis pointing opposite to gravity, and the z-axis pointing south, and taking the device position at startup as the origin (0, 0, 0) to complete initialization; when the second terminal selects, through a screening button, to view air marks, the second terminal requests the AR air mark information points within a radius of a certain distance centered on the GPS coordinate of the current terminal; calculating x-axis and z-axis offsets according to the GPS coordinates of the AR air marks returned by the server and the GPS coordinate of the current device; calculating the x-coordinate and z-coordinate of each AR air mark information point according to the offsets and the position of the current second terminal device in the scene; calculating the y-coordinate according to the difference between the altitude of the AR air mark information point returned by the server and the altitude of the current second terminal; and finally placing the AR stereoscopic graph of the AR air mark information point at the calculated xyz coordinate position.
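The coordinate conversion described in claim 3 — turning a mark's GPS fix into x/y/z offsets in the device-centered scene (x east, y opposite to gravity, z south) — can be sketched with a small-distance equirectangular approximation. This is an illustrative sketch under that stated approximation, not the patent's implementation; the function name and the use of a mean Earth radius are assumptions.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, metres


def gps_to_scene(cur_lat, cur_lon, cur_alt, mark_lat, mark_lon, mark_alt):
    """Convert a mark's GPS fix into scene offsets relative to the current
    device, using the claimed axes: x east, y up (opposite gravity), z south.
    Valid only for the short ranges an AR air mark is viewed over."""
    lat0 = math.radians(cur_lat)
    m_per_deg_lat = math.pi * EARTH_RADIUS_M / 180.0   # metres per degree latitude
    m_per_deg_lon = m_per_deg_lat * math.cos(lat0)     # shrinks with latitude
    east = (mark_lon - cur_lon) * m_per_deg_lon
    north = (mark_lat - cur_lat) * m_per_deg_lat
    x = east                 # x-axis points east
    z = -north               # z-axis points south, so a northern mark has negative z
    y = mark_alt - cur_alt   # altitude difference gives the height offset
    return x, y, z
```

Adding the result to the device's current scene position (the origin at startup) gives the xyz coordinate at which the AR stereoscopic graph is placed.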
4. The AR guidance-based mark tracking method of claim 3, wherein acquiring the coordinate information of the second terminal and the second terminal screen orientation information comprises:
acquiring the screen orientation of the second terminal through a gyroscope; and
acquiring longitude and latitude coordinates of the second terminal through GPS or a network.
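The loading condition of claim 3 — the target's coordinates falling within the coverage range of the screen orientation — reduces to testing whether the compass bearing to the target lies inside the horizontal field of view centered on the screen's heading. A minimal sketch follows; the function name and the 60° default field of view are assumptions, since the claims do not fix a coverage angle.

```python
def in_screen_coverage(bearing_to_target_deg, screen_heading_deg, fov_deg=60.0):
    """Return True when the bearing to the target lies within the horizontal
    field of view centred on the screen's compass heading (degrees).
    The modulo arithmetic normalises the difference to (-180, 180] so the
    test works across the 0/360 wrap-around."""
    diff = (bearing_to_target_deg - screen_heading_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0
```

Only when this test passes would the second terminal load the AR stereoscopic graph onto the target point in its screen.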
5. A mark generation device based on AR guidance, applied to a first terminal for locating a target point, the mark generation device comprising:
the target positioning module is used for acquiring the coordinate information of the first terminal and taking the coordinate information as the coordinate information of a target point;
the marking pattern selection module is used for selecting a preset 3D marking pattern;
the AR stereoscopic graph generating module is used for placing the 3D mark pattern above the target point to form an AR stereoscopic graph, wherein forming the AR stereoscopic graph further comprises: loading the coordinate information of the target point onto the 3D mark pattern, acquiring the time when the AR stereoscopic graph is formed, and loading the time information onto the 3D mark pattern;
the AR air mark information point generation module is used for binding the coordinate information of the target point with the AR stereoscopic graph to form an AR air mark information point; and
the communication module is used for sending the AR air mark information point to a second terminal for tracking.
6. The AR guidance-based mark generation device of claim 5, further comprising a marking mode selection module for selecting a marking mode before the AR air mark information point is sent to the second terminal for tracking, wherein the marking mode comprises a fixed positioning marking mode or a mobile positioning marking mode; in the mobile positioning marking mode, the coordinate information of the first terminal is updated at a preset time interval, and AR air mark information points are generated and sent to the second terminal.
7. A marker-tracking device based on AR guidance, applied to a second terminal for tracking a target point, the marker-tracking device comprising:
the target point acquisition module is used for acquiring coordinate information of the target point and an AR stereoscopic graph according to an AR air mark information point sent by a first terminal for locating the target point;
the AR information initializing module is used for acquiring the coordinate information of the second terminal and the screen orientation information of the second terminal;
the AR stereoscopic graph loading module is used for changing the orientation of the second terminal screen, and loading the AR stereoscopic graph onto the target point in the screen of the second terminal when the coordinate information of the target point falls within the coverage range of the second terminal screen orientation;
the target point distance loading module is used for determining the distance between the second terminal and the target point according to the coordinate information of the second terminal and the coordinate information of the target point, placing the distance on the AR stereoscopic graph, and initializing an AR scene based on gyroscope and GPS information of the second terminal: according to the gyroscope and GPS data, establishing an AR three-dimensional scene with the x-axis pointing east, the y-axis pointing opposite to gravity, and the z-axis pointing south, and taking the device position at startup as the origin (0, 0, 0) to complete initialization; when the second terminal selects, through a screening button, to view air marks, the second terminal requests the AR air mark information points within a radius of a certain distance centered on the GPS coordinate of the current terminal; calculating x-axis and z-axis offsets according to the GPS coordinates of the AR air marks returned by the server and the GPS coordinate of the current device; calculating the x-coordinate and z-coordinate of each AR air mark information point according to the offsets and the position of the current second terminal device in the scene; calculating the y-coordinate according to the difference between the altitude of the AR air mark information point returned by the server and the altitude of the current second terminal; and finally placing the AR stereoscopic graph of the AR air mark information point at the calculated xyz coordinate position.
CN201911182161.6A 2019-11-27 2019-11-27 Mark generation tracking method and device based on AR guide Active CN110969704B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911182161.6A CN110969704B (en) 2019-11-27 2019-11-27 Mark generation tracking method and device based on AR guide


Publications (2)

Publication Number Publication Date
CN110969704A CN110969704A (en) 2020-04-07
CN110969704B true CN110969704B (en) 2023-09-22

Family

ID=70031806

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911182161.6A Active CN110969704B (en) 2019-11-27 2019-11-27 Mark generation tracking method and device based on AR guide

Country Status (1)

Country Link
CN (1) CN110969704B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111521193A (en) * 2020-04-23 2020-08-11 广东博智林机器人有限公司 Live-action navigation method, live-action navigation device, storage medium and processor
CN114158022A (en) * 2021-12-07 2022-03-08 阿维塔科技(重庆)有限公司 Feed rescue method, device, system and equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104598504A (en) * 2014-05-15 2015-05-06 腾讯科技(深圳)有限公司 Information display control method and device for electronic map
CN107077739A (en) * 2017-01-23 2017-08-18 香港应用科技研究院有限公司 Use the three dimensional indicia model construction and real-time tracking of monocular camera
CN107566793A (en) * 2017-08-31 2018-01-09 中科云创(北京)科技有限公司 Method, apparatus, system and electronic equipment for remote assistance
CN108830894A (en) * 2018-06-19 2018-11-16 亮风台(上海)信息科技有限公司 Remote guide method, apparatus, terminal and storage medium based on augmented reality
CN109618055A (en) * 2018-12-25 2019-04-12 维沃移动通信有限公司 A kind of position sharing method and mobile terminal
CN109887003A (en) * 2019-01-23 2019-06-14 亮风台(上海)信息科技有限公司 A kind of method and apparatus initialized for carrying out three-dimensional tracking

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10339713B2 (en) * 2015-08-12 2019-07-02 International Business Machines Corporation Marker positioning for augmented reality overlays




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant