CN112825198B - Mobile tag display method, device, terminal equipment and readable storage medium - Google Patents


Info

Publication number
CN112825198B
CN112825198B (application CN201911150100.1A)
Authority
CN
China
Prior art keywords
scene
mobile device
target camera
coordinates
mobile
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911150100.1A
Other languages
Chinese (zh)
Other versions
CN112825198A (en)
Inventor
许红锦
史有华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Uniview Technologies Co Ltd
Original Assignee
Zhejiang Uniview Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Uniview Technologies Co Ltd filed Critical Zhejiang Uniview Technologies Co Ltd
Priority to CN201911150100.1A priority Critical patent/CN112825198B/en
Publication of CN112825198A publication Critical patent/CN112825198A/en
Application granted granted Critical
Publication of CN112825198B publication Critical patent/CN112825198B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/005 General purpose rendering architectures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An embodiment of the application provides a mobile tag display method, a mobile tag display device, a terminal device and a readable storage medium. An augmented reality (AR) scene of a selected target camera is constructed according to the position information of the target camera, and the proportional relationship between the AR scene and the actual scene is recorded; AR mapping coordinates of each mobile device in the AR scene are then determined according to the device position information of each mobile device, the position information of the target camera and the proportional relationship, so that the AR tag of each mobile device can be displayed in the AR scene. Mobile device tags can therefore be accessed automatically and displayed in batches in the AR live picture of the target camera. Because the AR scene simulates the actual scene, the solution is more flexible and lightweight than processing on the front-end equipment side: it does not depend on the front-end equipment, is not limited by hardware, does not rely on processing by the camera, and the tags need not be stored in the camera's code stream or coupled to the camera, which improves the real-time performance of tag display.

Description

Mobile tag display method, device, terminal equipment and readable storage medium
Technical Field
The present invention relates to the field of monitoring technologies, and in particular, to a mobile tag display method, a mobile tag display device, a terminal device, and a readable storage medium.
Background
At present, real-time video enhancement based on three-dimensional modeling is increasingly applied in the field of video monitoring. Augmented reality refers to adding computer-generated information to an image in real time or near real time: elements of the real world are tagged and virtualized, and the tagged and virtualized elements are overlaid and presented on a display interface showing the real world.
With the development of technology, superimposing AR (Augmented Reality) labels on camera video has broad application prospects in the field of urban monitoring. However, current schemes only mark fixed point positions, which makes it difficult to meet present-day command and dispatch requirements well.
Disclosure of Invention
In view of the foregoing, an object of the present application is to provide a mobile tag display method, apparatus, terminal device, and readable storage medium, capable of automatically accessing and displaying a mobile device tag in an AR live screen of a target camera.
According to an aspect of the present application, there is provided a mobile tag display method, applied to a terminal device, the method including:
obtaining equipment information of each mobile equipment accessed in a monitoring scene, wherein the equipment information comprises equipment position information;
Constructing an Augmented Reality (AR) scene of a target camera according to the position information of the selected target camera, and recording a proportional relationship between the AR scene and an actual scene, wherein the AR scene comprises a virtual camera corresponding to the target camera;
determining AR mapping coordinates of each mobile device in the AR scene according to the device position information of each mobile device, the position information of the target camera and the proportional relation;
and displaying the AR label of each mobile device in the AR scene according to the AR mapping coordinates of each mobile device in the AR scene.
In one possible implementation, the step of constructing an augmented reality AR scene of the target camera according to the location information of the selected target camera includes:
according to the position information of the target camera, constructing a three-dimensional simulation scene corresponding to the target camera according to a preset three-dimensional drawing protocol;
monitoring the working parameter change of the target camera, and controlling the virtual camera to execute synchronous actions corresponding to the target camera in the three-dimensional simulation scene according to the working parameter change so as to construct an Augmented Reality (AR) scene of the target camera.
In a possible implementation manner, the step of determining the AR mapping coordinates of each mobile device in the AR scene according to the device location information of each mobile device, the location information of the target camera and the scaling relationship includes:
for each mobile device, determining a distance relation between the real coordinates of the mobile device and the mapping coordinates of the target camera in a set plane coordinate system according to the device position information of the mobile device and the position information of the target camera;
determining an angle relation between the real coordinates of the mobile device and the mapping coordinates of the target camera in a set plane coordinate system;
and determining AR mapping coordinates of each mobile device in the AR scene according to the distance relation, the angle relation and the proportional relation.
In a possible implementation manner, the real coordinates of the mobile device are located in the set plane coordinate system, and the step of determining the angular relationship between the real coordinates of the mobile device and the mapped coordinates of the target camera in the set plane coordinate system includes:
calculating a sine angle relation between the real coordinates of the mobile equipment and the mapping coordinates of the target camera in the direction of a first coordinate axis in the set plane coordinate system;
Calculating cosine angle relation between the real coordinates of the mobile equipment and the mapping coordinates of the target camera in the direction of a second coordinate axis in the set plane coordinate system;
the first coordinate axis and the second coordinate axis are perpendicular to each other, and the sine angle relation and the cosine angle relation form an angle relation between a real coordinate of the mobile device and a mapping coordinate of the target camera in a set plane coordinate system.
In a possible implementation manner, the step of determining the AR mapping coordinates of each mobile device in the AR scene according to the distance relationship, the angle relationship and the proportional relationship includes:
determining an AR distance relation between the AR coordinates of the mobile device and the AR coordinates of the virtual camera in the AR scene according to the distance relation and the proportional relation;
determining a first AR coordinate of the virtual camera in the AR scene, and determining an AR mapping point coordinate of the first AR coordinate in the set plane coordinate system;
determining a position relationship between the mobile device and the target camera in the AR scene according to the AR mapping point coordinates, the angle relationship and the AR distance relationship;
And calculating second AR coordinates of the mobile device in the AR scene according to the position relation between the mobile device and the target camera in the AR scene so as to determine AR mapping coordinates of the mobile device in the AR scene.
In one possible embodiment, the method further comprises:
when the working parameters of the target camera are detected to change, the working state of the virtual camera is adjusted according to the changed working parameters;
and correspondingly adjusting the display position of the AR label of each mobile device in the AR scene according to the adjusted working state of the virtual camera.
In one possible embodiment, the method further comprises:
and when detecting a new mobile device in the monitored scene of the target camera, adding the display position of the AR label of the new mobile device in the AR scene.
According to another aspect of the present application, there is provided a mobile tag display apparatus applied to a terminal device, the apparatus comprising:
the acquisition module is used for acquiring equipment information of each mobile equipment accessed in the monitoring scene, wherein the equipment information comprises equipment position information;
the building module is used for building an Augmented Reality (AR) scene of the target camera according to the position information of the selected target camera and recording the proportional relation between the AR scene and the actual scene, wherein the AR scene comprises a virtual camera corresponding to the target camera;
The determining module is used for determining AR mapping coordinates of each mobile device in the AR scene according to the device position information of each mobile device, the position information of the target camera and the proportional relation;
and the tag display module is used for displaying the AR tag of each mobile device in the AR scene according to the AR mapping coordinates of each mobile device in the AR scene.
According to another aspect of the present application, there is provided a terminal device including a machine-readable storage medium storing machine-executable instructions and a processor, which when executing the machine-executable instructions, implements the mobile tag display method described above.
According to another aspect of the present application, there is provided a readable storage medium having stored therein machine-executable instructions that when executed implement the aforementioned mobile tag display method.
Based on any one of the above aspects, the present application constructs an augmented reality AR scene of a target camera according to the position information of the selected target camera, records the proportional relationship between the AR scene and the actual scene, and then determines the AR mapping coordinates of each mobile device in the AR scene according to the device position information of each mobile device, the position information of the target camera and the proportional relationship, so as to display the AR tag of each mobile device in the AR scene. Mobile device tags can therefore be accessed automatically and displayed in batches in the AR live picture of the target camera. Because the AR scene simulates the actual scene, the solution is more flexible and lightweight than processing on the front-end equipment side: it does not depend on the front-end equipment, is not limited by hardware, does not rely on processing by the camera, and the tags need not be stored in the camera's code stream or coupled to the camera, thereby improving the real-time performance of tag display.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered limiting the scope, and that other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 shows an application scenario schematic diagram of a mobile tag display method provided in an embodiment of the present application;
fig. 2 illustrates one of flow diagrams of a mobile tag display method according to an embodiment of the present application;
fig. 3 shows a schematic flow chart of substeps of step S120 shown in fig. 2;
fig. 4 shows a schematic flow chart of the substeps of step S130 shown in fig. 2;
FIG. 5 illustrates one of the schematic diagrams of determining the positional relationship between the mobile device and the virtual camera provided in the embodiments of the present application;
FIG. 6 illustrates a second schematic diagram of determining a positional relationship between a mobile device and the virtual camera according to an embodiment of the present application;
FIG. 7 is a second flow chart of a mobile tag display method according to an embodiment of the present disclosure;
FIG. 8 is a third flow chart illustrating a mobile tag display method according to an embodiment of the present disclosure;
fig. 9 is a schematic functional block diagram of a mobile tag display device according to an embodiment of the present application;
fig. 10 is a schematic block diagram of a terminal device for implementing the mobile tag display method according to the embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it should be understood that the accompanying drawings in the present application are only for the purpose of illustration and description, and are not intended to limit the protection scope of the present application. In addition, it should be understood that the schematic drawings are not drawn to scale. A flowchart, as used in this application, illustrates operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be implemented out of order and that steps without logical context may be performed in reverse order or concurrently. Moreover, one or more other operations may be added to the flow diagrams and one or more operations may be removed from the flow diagrams as directed by those skilled in the art.
In addition, the described embodiments are only some, but not all, of the embodiments of the present application. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, are intended to be within the scope of the present application.
Regarding the technical problem mentioned in the background, the automatic tag adding algorithms currently applied to AR live images mainly convert the point position information to be marked into PT coordinates relative to the image capturing device (where P refers to the horizontal angle and T to the pitch angle), calculate coordinates in a corresponding plane coordinate system from the PT values for display, and superimpose the tags onto the image of the image capturing device. The generated tag information of the monitoring points is usually stored in the code stream of the image capturing device, so that the image can still present the monitoring-point tag information when the camera in the image capturing device is restarted.
However, in the above scheme the front-end equipment must first record the position information of the marked points. Although the AR labels are added automatically and in batches, the front-end equipment must have a corresponding processing device, and the label information of the monitoring points has to be coupled with the code stream of the camera equipment, so real-time access of mobile point labels cannot be realized.
For this reason, based on the discovery of the above technical problems, the inventors propose the following technical solutions to solve or improve on them. It should be noted that the drawbacks of the above prior-art solutions were found by the inventors after practice and careful study; therefore, the discovery of the above problems and the solutions proposed hereinafter in the embodiments of the present application should be regarded as the inventors' contribution to the present application during the invention process, and should not be construed as common knowledge of those skilled in the art.
Fig. 1 shows an application scenario schematic diagram of the mobile tag display method provided in an embodiment of the present application. In this embodiment, the application scenario may include a terminal device 100, mobile devices 200, and a camera 300. The terminal device 100 may be communicatively connected to a plurality of mobile devices 200 and the camera 300, where the camera 300 may be used to monitor real-time images of a scene in which a plurality of mobile devices 200 typically exist. A mobile device 200 may be, for example but not limited to, any electronic device having a positioning function, such as an in-vehicle central control terminal, a law-enforcement recorder, a drone, and the like. These mobile devices 200 may be connected to the terminal device 100 in advance so as to transmit their own device information to the terminal device 100 in real time or at certain intervals, so that the terminal device 100 can add corresponding AR tags to the monitoring picture of the camera 300.
Fig. 2 shows a flowchart of a mobile tag display method according to an embodiment of the present application, where the mobile tag display method may be performed by the terminal device 100 shown in fig. 1. It should be understood that, in other embodiments, the order of some steps in the mobile tag display method of the present embodiment may be interchanged according to actual needs, or some steps may be omitted or deleted. The detailed steps of the mobile tag display method are described as follows.
Step S110, obtaining device information of each mobile device 200 accessed in the monitoring scene.
Step S120, an augmented reality AR scene of the target camera is constructed according to the position information of the selected target camera, and a proportional relationship between the AR scene and the actual scene is recorded.
Step S130, determining the AR mapping coordinates of each mobile device 200 in the AR scene according to the device location information of each mobile device 200, the location information of the target camera, and the scaling relationship.
In step S140, according to the AR mapping coordinates of each mobile device 200 in the AR scene, the AR label of each mobile device 200 is displayed in the AR scene.
In this embodiment, the device information of the mobile device 200 may include device location information, for example, but not limited to, longitude and latitude information; correspondingly, the location information of the selected target camera may also be, but is not limited to, longitude and latitude information. It should be noted that the user may select any one of the cameras 300 to be viewed as the target camera through the terminal device 100 according to actual requirements, which is not limited in this embodiment. For example, a camera 300 located at a high point of a monitored scene may generally be selected as the target camera for that scene, so that the AR tags of the mobile devices 200 can later be added more completely in the live AR picture of the target camera.
In this embodiment, the above-mentioned AR scene may include a virtual camera corresponding to the target camera, so that the AR tag of the mobile device 200 may be added in association with the virtual camera later, without depending on the processing of the camera 300, and the tag of the mobile device 200 may not need to be stored in the code stream of the camera 300, thereby improving the real-time performance of the tag display.
Thus, based on the above steps, the mobile tag display method provided in this embodiment constructs an augmented reality AR scene of the target camera according to the position information of the selected target camera, records the proportional relationship between the AR scene and the actual scene, and then determines the AR mapping coordinates of each mobile device 200 in the AR scene according to the device position information of each mobile device 200, the position information of the target camera and the proportional relationship, so as to display the AR tag of each mobile device 200 in the AR scene. The tags of the mobile devices 200 can therefore be automatically accessed and displayed in batches in the AR live picture of the target camera. Since the AR scene simulates the actual scene, the approach is more flexible and lightweight than processing on the front-end equipment side: it does not depend on the front-end equipment, is not limited by hardware, does not rely on processing by the camera 300, and the tags need not be stored in the code stream of the camera 300 or coupled with the camera 300, thereby improving the real-time performance of tag display.
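For illustration only, the overall flow of steps S110 to S140 on the terminal device 100 might be organized roughly as in the following sketch; the type and function names are hypothetical and are not defined by this application.

```typescript
// Hypothetical types; the patent does not prescribe concrete data structures.
interface DeviceInfo { id: string; lon: number; lat: number; }          // device position information
interface CameraInfo { lon: number; lat: number; height: number; }      // target camera position
interface ARScene {
  scale: number;                                                         // proportional relation K between AR scene and actual scene
  addLabel(deviceId: string, x: number, y: number, z: number): void;    // displays an AR tag at the given AR coordinates
}

// Orchestration of steps S110-S140; the two helpers correspond to steps S120 and S130
// and are passed in here because their details are covered by the sub-steps discussed later.
function displayMobileTags(
  devices: DeviceInfo[],                                                 // S110: device info of the accessed mobile devices
  targetCamera: CameraInfo,
  buildArScene: (camera: CameraInfo) => ARScene,                         // S120: construct AR scene, record scale
  toArCoordinates: (d: DeviceInfo, c: CameraInfo, s: ARScene) => [number, number, number], // S130
): void {
  const scene = buildArScene(targetCamera);
  for (const device of devices) {
    const [x, y, z] = toArCoordinates(device, targetCamera, scene);      // S130: AR mapping coordinates
    scene.addLabel(device.id, x, y, z);                                  // S140: display the AR tag
  }
}
```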
In one possible implementation, for step S120, in order to enhance the simulation experience of the augmented reality AR scene of the target camera, an exemplary manner of constructing the AR scene is described below in conjunction with fig. 3. Referring to fig. 3, step S120 may be implemented by the following substeps:
and step S131, constructing a three-dimensional simulation scene corresponding to the target camera according to a preset three-dimensional drawing protocol according to the position information of the target camera.
And S132, monitoring the working parameter change of the target camera, and controlling the virtual camera to execute the synchronous action corresponding to the target camera in the three-dimensional simulation scene according to the working parameter change so as to construct the augmented reality AR scene of the target camera.
In this embodiment, in the substep S131, after determining the position information of the target camera, the basic scene information of the three-dimensional simulated scene to be constructed may be determined according to the position information of the target camera. On the basis, a three-dimensional simulation scene corresponding to the target camera can be constructed according to a preset three-dimensional drawing protocol, and a virtual camera of the target camera can be included in the three-dimensional model scene.
The preset three-dimensional drawing protocol can be selected according to actual requirements. For example, the WebGL (Web Graphics Library) protocol can be chosen to build the three-dimensional simulation scene: WebGL combines JavaScript with OpenGL ES 2.0 and, by providing a JavaScript binding for OpenGL ES 2.0, offers hardware-accelerated 3D rendering for the HTML5 Canvas, so that the 3D scene model can be displayed by means of the system graphics card.
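As a concrete illustration of substep S131, the three-dimensional simulation scene and its virtual camera could be set up on top of WebGL roughly as follows. The patent only requires a WebGL-based drawing protocol; the use of the three.js library and the specific camera parameters below are assumptions made for this sketch.

```typescript
import * as THREE from 'three';

// Build the three-dimensional simulation scene for the target camera; cameraPosition is
// derived from the target camera's position information (converted to scene units).
function buildSimulationScene(canvas: HTMLCanvasElement,
                              cameraPosition: { x: number; y: number; z: number }) {
  const scene = new THREE.Scene();

  // Virtual camera standing in for the target camera 300 (fov/clip planes are illustrative).
  const virtualCamera = new THREE.PerspectiveCamera(
    60, canvas.clientWidth / canvas.clientHeight, 0.1, 10000);
  virtualCamera.position.set(cameraPosition.x, cameraPosition.y, cameraPosition.z);

  // Hardware-accelerated rendering via the system graphics card, as the WebGL protocol allows.
  const renderer = new THREE.WebGLRenderer({ canvas, antialias: true });
  renderer.setSize(canvas.clientWidth, canvas.clientHeight);

  const render = () => {
    renderer.render(scene, virtualCamera);
    requestAnimationFrame(render);
  };
  render();

  return { scene, virtualCamera, renderer };
}
```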
In the substep S132, considering that the target camera is not always static during actual monitoring and may perform operations such as rotation and zooming, the virtual camera may be controlled to execute synchronization actions corresponding to the target camera in the three-dimensional simulation scene according to changes in the target camera's working parameters, in order to improve the simulation experience of the augmented reality AR scene. For example, a control class corresponding to the usage pattern of the actual camera 300 can be implemented: the parameter changes of the target camera are monitored through the control class and passed to the three-dimensional simulation model, and the control class processes these parameters so as to keep the changes of the virtual camera synchronized with the actual target camera, making the constructed augmented reality AR scene more realistic and closer to the actual scene. A sketch of such a control class follows the note below.
Optionally, the above-mentioned working parameters may be, but are not limited to, a gyroscope parameter, a zoom parameter, a rotation parameter, and the like, which is not limited in this embodiment.
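A minimal sketch of such a control class is given below, assuming the three.js virtual camera from the previous sketch; the mapping from pan/tilt/zoom values to rotations and field of view is illustrative and not specified by this application.

```typescript
import * as THREE from 'three';

interface WorkingParams { pan: number; tilt: number; zoom: number; }   // assumed parameter set

class VirtualCameraController {
  constructor(private virtualCamera: THREE.PerspectiveCamera) {}

  // Called whenever the monitored target camera reports a working-parameter change.
  onWorkingParamsChanged(params: WorkingParams): void {
    // Pan/tilt map to rotations of the virtual camera (Y = horizontal angle, X = pitch).
    this.virtualCamera.rotation.y = THREE.MathUtils.degToRad(-params.pan);
    this.virtualCamera.rotation.x = THREE.MathUtils.degToRad(-params.tilt);

    // Zoom maps to the field of view; the 60° base value and linear relation are only illustrative.
    this.virtualCamera.fov = 60 / Math.max(params.zoom, 1);
    this.virtualCamera.updateProjectionMatrix();
  }
}
```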
In a possible implementation, for step S130, the positional relationship between each mobile device 200 and the virtual camera in the AR scene can be used to characterize where each mobile device 200 should be displayed in the AR scene. Since the virtual camera has a certain proportional relationship to the target camera and is presented to the user as a two-dimensional picture during actual display, the positional relationship between each mobile device 200 and the virtual camera needs to be calculated accurately. Based on this, an exemplary calculation method for step S130 is given below in detail with reference to figs. 4 to 6. Referring to fig. 4, step S130 may be implemented by the following substeps:
in a substep S131, for each mobile device 200, a distance relationship between the real coordinates of the mobile device 200 and the mapped coordinates of the target camera in the set planar coordinate system is determined according to the device position information of the mobile device 200 and the position information of the target camera.
Sub-step S132, determining an angular relationship between the real coordinates of the mobile device 200 and the mapped coordinates of the target camera in the set plane coordinate system.
Sub-step S133, determining AR mapping coordinates of each mobile device 200 in the AR scene according to the distance relationship, the angle relationship and the scale relationship.
In one possible example, referring to fig. 5 for substep S131, assume that the current real coordinates of mobile device B are (X2, Y2, 0) and the real coordinates of target camera A are (X1, Y1, Z1). Taking the set plane coordinate system to be the coordinate system of the earth's ground plane as an example, according to the relationship between the two points, the distance relation L between the real coordinates of mobile device B and the mapping point of target camera A on the ground plane may satisfy:

L = R × arccos(cos(90 − Y2) × cos(90 − Y1) + sin(90 − Y2) × sin(90 − Y1) × cos(X2 − X1)) × π/180

where R denotes the Earth's radius and the angles are in degrees.
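The relation above is the spherical law of cosines written with angles in degrees; in code (with angles converted to radians, which is equivalent) it might look like the following sketch, where the Earth radius value is an assumption for illustration.

```typescript
const EARTH_RADIUS_M = 6371000; // mean Earth radius in metres (illustrative value)

// Great-circle distance L between the mobile device's longitude/latitude (x2, y2)
// and the target camera's ground-plane mapping point (x1, y1).
function groundDistance(x1: number, y1: number, x2: number, y2: number): number {
  const rad = (deg: number) => (deg * Math.PI) / 180;
  // cos(90° − lat) = sin(lat) and sin(90° − lat) = cos(lat)
  const cosAngle =
    Math.sin(rad(y2)) * Math.sin(rad(y1)) +
    Math.cos(rad(y2)) * Math.cos(rad(y1)) * Math.cos(rad(x2 - x1));
  // Clamp to [-1, 1] to guard against floating-point error before arccos.
  return EARTH_RADIUS_M * Math.acos(Math.min(1, Math.max(-1, cosAngle)));
}
```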
in one possible example, for sub-step S132, a sine angle relationship of the real coordinates of the mobile device 200 and the mapped coordinates of the target camera in the direction of the first coordinate axis in the set plane coordinate system may be calculated, while a cosine angle relationship of the real coordinates of the mobile device 200 and the mapped coordinates of the target camera in the direction of the second coordinate axis in the set plane coordinate system may be calculated. The first coordinate axis and the second coordinate axis are perpendicular to each other, and the sine angle relationship and the cosine angle relationship form an angle relationship between the real coordinate of the mobile device 200 and the mapping coordinate of the target camera in the set plane coordinate system.
For example, referring to fig. 6, the angular relationship between the real coordinates of the mobile device 200 and the mapped coordinates of the target camera in the set plane coordinate system may be expressed by a sine angle relation sin θ of the two coordinates in the direction of the first coordinate axis, together with a cosine angle relation cos θ of the two coordinates in the direction of the second coordinate axis of the set plane coordinate system.
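The exact expressions from fig. 6 are not reproduced in the text; the sketch below shows one consistent way (an assumption) to obtain the sine and cosine angle relations from the coordinate offsets along the two axes of the set plane coordinate system.

```typescript
// Angle relation between the mobile device's real coordinates (x2, y2) and the target
// camera's mapping coordinates (x1, y1) in the set plane coordinate system.
function angleRelation(x1: number, y1: number, x2: number, y2: number) {
  const dx = x2 - x1;               // offset along the first coordinate axis
  const dy = y2 - y1;               // offset along the second coordinate axis
  const len = Math.hypot(dx, dy) || 1; // avoid division by zero when the points coincide
  return {
    sinTheta: dx / len,             // sine angle relation (first coordinate axis direction)
    cosTheta: dy / len,             // cosine angle relation (second coordinate axis direction)
  };
}
```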
On this basis, please refer again to fig. 5, in sub-step S133, an AR distance relation between AR coordinates of the mobile device 200 and AR coordinates of the virtual camera in the AR scene may be determined according to the distance relation and the proportional relation.
For example, if the scaling factor is known to be K, the AR distance relation between the AR coordinates of the mobile device 200 and the AR coordinates of the virtual camera in the AR scene can be calculated as L1 = L / K.
Then, the first AR coordinates of the virtual camera in the AR scene may be determined, and the AR mapping point coordinates of the first AR coordinates in the set plane coordinate system may be determined. For example, as shown in fig. 5, if the first AR coordinates of the virtual camera in the AR scene are known in advance to be (A1, B1, C1), then the AR mapping point coordinates of the first AR coordinates (A1, B1, C1) in the AR scene can be determined to be (A1, 0, C1).
Then, the positional relationship between the mobile device 200 and the target camera in the AR scene may be determined according to the AR mapping point coordinates (A1, 0, C1), the angle relation and the AR distance relation L1. Still taking fig. 5 as an example, assume the mobile device 200 is located at point B2 in the virtual scene with coordinates (A2, 0, C2); the positional relationship then gives the offsets A2 − A1 = L1 × sin θ along the first coordinate axis and C2 − C1 = L1 × cos θ along the second coordinate axis.

On this basis, the second AR coordinates of the mobile device 200 in the AR scene may be calculated according to the positional relationship between the mobile device 200 and the target camera in the AR scene, so as to determine the AR mapping coordinates of the mobile device 200 in the AR scene. Still taking fig. 5 as an example, the second AR coordinates of the mobile device 200 in the AR scene can be obtained as (A1 + L1 × sin θ, 0, C1 + L1 × cos θ).
Thus, in step S140, a tag point may be determined in the AR scene according to the aforementioned second AR coordinates and used as the AR tag position of the mobile device 200. For example, the tag point may be assigned the second AR coordinates obtained above, and the AR tag of the mobile device 200 can thus be displayed in the AR scene.
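Combining the scale factor, the angle relation and the camera's AR mapping point, substep S133 might be computed as in the following sketch; the sign convention of the offsets depends on the orientation of the AR scene's axes and is an assumption here.

```typescript
// Compute the mobile device's AR mapping coordinate from the ground distance L, the
// scale factor K, the angle relation (sin θ, cos θ) and the virtual camera's mapping
// point (A1, 0, C1) in the AR scene.
function arMappingCoordinate(
  distanceL: number, scaleK: number,
  sinTheta: number, cosTheta: number,
  cameraAr: { a1: number; c1: number },
): [number, number, number] {
  const l1 = distanceL / scaleK;                  // AR distance relation L1 = L / K
  const a2 = cameraAr.a1 + l1 * sinTheta;         // offset along the first coordinate axis
  const c2 = cameraAr.c1 + l1 * cosTheta;         // offset along the second coordinate axis
  return [a2, 0, c2];                             // lies on the AR ground plane (y = 0)
}
```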
In a possible implementation, after the foregoing step S140, the monitored scene will normally change when the working parameters of the target camera change. Correspondingly, so that the AR scene can adaptively adjust the AR tags along with the changed monitoring scene, please further refer to fig. 7: the mobile tag display method provided in the embodiment of the present application may further include the following steps.
And step S150, when the working parameters of the target camera are detected to be changed, the working state of the virtual camera is adjusted according to the changed working parameters.
Step S160 correspondingly adjusts the display position of the AR tag of each mobile device 200 in the AR scene according to the adjusted working state of the virtual camera.
In this embodiment, the working parameters may be the gyroscope parameters, zoom parameters, rotation parameters and the like listed above. When the working parameters of the target camera change, the virtual camera can be adjusted to synchronize with the working state of the target camera, and once synchronization is completed, the display position of the AR label of each mobile device 200 can be adjusted correspondingly in the AR scene. For example, when the monitoring scene of the target camera has shifted after synchronization is completed, AR labels falling within the shifted monitoring scene can be added to the AR scene accordingly, and AR labels that are no longer located in the AR scene can be removed. In this way, the AR tags can be continuously updated to reflect the monitored changes of the mobile devices 200 when the working parameters of the target camera change; a sketch of this update step is given below.
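As one possible sketch of steps S150 and S160 (reusing the VirtualCameraController and WorkingParams types assumed earlier), the labels could be re-evaluated against the virtual camera's view frustum after each synchronization; the frustum test is an assumption about how "no longer located in the AR scene" might be decided.

```typescript
import * as THREE from 'three';

function onTargetCameraChanged(params: WorkingParams,
                               controller: VirtualCameraController,
                               virtualCamera: THREE.PerspectiveCamera,
                               labels: Map<string, THREE.Object3D>): void {
  controller.onWorkingParamsChanged(params);                     // S150: adjust virtual camera state
  virtualCamera.updateMatrixWorld();

  // Build the current view frustum of the virtual camera.
  const frustum = new THREE.Frustum().setFromProjectionMatrix(
    new THREE.Matrix4().multiplyMatrices(
      virtualCamera.projectionMatrix, virtualCamera.matrixWorldInverse));

  // S160: show labels that remain in view, hide those that have left the scene.
  for (const label of labels.values()) {
    label.visible = frustum.containsPoint(label.getWorldPosition(new THREE.Vector3()));
  }
}
```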
In a possible implementation manner, after the foregoing step S140, when a mobile device is newly added, in order to facilitate real-time display of the newly added mobile device in an AR scene, referring further to fig. 8, the mobile tag display method provided in the embodiment of the present application may further include the following steps:
Step S170, when detecting the newly added mobile device in the monitored scene of the target camera, adding the display position of the AR label of the newly added mobile device in the AR scene.
In this embodiment, following the implementations of the foregoing steps S110 to S130, the AR mapping coordinates of the newly added mobile device in the AR scene may be determined, so that the AR label of the newly added mobile device can be displayed in the AR scene, which facilitates real-time display of the newly added mobile device in the AR scene.
Based on the same inventive concept, please refer to fig. 9, which is a schematic diagram illustrating functional modules of the mobile tag display device 110 according to an embodiment of the present application, the present embodiment may divide the functional modules of the mobile tag display device 110 according to the above-mentioned method embodiment. For example, each functional module may be divided corresponding to each function, or two or more functions may be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation. For example, in the case of dividing the respective function modules with the respective functions, the mobile tag display device 110 shown in fig. 9 is only one device schematic diagram. The mobile tag display device 110 may include an obtaining module 111, a constructing module 112, a determining module 113, and a tag display module 114, and functions of each functional module of the mobile tag display device 110 will be described in detail below.
An obtaining module 111, configured to obtain device information of each mobile device 200 accessed in the monitoring scene, where the device information includes device location information. It is understood that the obtaining module 111 may be used to perform the step S110 described above, and reference may be made to the details of the implementation of the obtaining module 111 regarding the step S110 described above.
The construction module 112 is configured to construct an augmented reality AR scene of the target camera according to the position information of the selected target camera, and record a proportional relationship between the AR scene and the actual scene, where the AR scene includes a virtual camera corresponding to the target camera. It will be appreciated that the building block 112 may be adapted to perform step S120 described above, and reference may be made to the details of implementation of the building block 112 as described above with respect to step S120.
A determining module 113, configured to determine AR mapping coordinates of each mobile device 200 in the AR scene according to the device location information of each mobile device 200, the location information of the target camera, and the scaling relationship. It is understood that the determining module 113 may be used to perform the above step S130, and reference may be made to the above description of the step S130 for a detailed implementation of the determining module 113.
The tag display module 114 is configured to display an AR tag of each mobile device 200 in an AR scene according to AR mapping coordinates of each mobile device 200 in the AR scene. It will be appreciated that the tag display module 114 may be used to perform step S140 described above, and reference may be made to the details of step S140 regarding the detailed implementation of the tag display module 114.
In one possible implementation, the construction module 112 may construct the augmented reality AR scene of the target camera by:
according to the position information of the target camera, constructing a three-dimensional simulation scene corresponding to the target camera according to a preset three-dimensional drawing protocol;
monitoring the working parameter change of the target camera, and controlling the virtual camera to execute the synchronous action corresponding to the target camera in the three-dimensional simulation scene according to the working parameter change so as to construct the augmented reality AR scene of the target camera.
In one possible implementation, the determination module 113 may determine the AR mapping coordinates of each mobile device 200 in the AR scene by:
for each mobile device 200, determining a distance relation between the real coordinates of the mobile device 200 and the mapping coordinates of the target camera in a set plane coordinate system according to the device position information of the mobile device 200 and the position information of the target camera;
Determining an angular relationship between the real coordinates of the mobile device 200 and the mapped coordinates of the target camera in the set planar coordinate system;
the AR mapping coordinates of each mobile device 200 in the AR scene are determined according to the distance relationship, the angle relationship, and the scale relationship.
In one possible implementation, the real coordinates of the mobile device 200 are located in the set plane coordinate system, and the determining module 113 may determine the angular relationship between the real coordinates of the mobile device 200 and the mapped coordinates of the target camera in the set plane coordinate system by:
calculating a sine angle relation between the real coordinates of the mobile device 200 and the mapping coordinates of the target camera in the direction of a first coordinate axis in a set plane coordinate system;
calculating the cosine angle relation between the real coordinates of the mobile device 200 and the mapping coordinates of the target camera in the direction of the second coordinate axis in the set plane coordinate system;
the first coordinate axis and the second coordinate axis are perpendicular to each other, and the sine angle relationship and the cosine angle relationship form an angle relationship between the real coordinate of the mobile device 200 and the mapping coordinate of the target camera in the set plane coordinate system.
In one possible implementation, the determination module 113 may determine the AR mapping coordinates of each mobile device 200 in the AR scene by:
Determining an AR distance relation between AR coordinates of the mobile device 200 and AR coordinates of the virtual camera in the AR scene according to the distance relation and the proportional relation;
determining a first AR coordinate of the virtual camera in the AR scene, and determining an AR mapping point coordinate of the first AR coordinate in a set plane coordinate system;
determining a positional relationship between the mobile device 200 and the target camera in the AR scene according to the AR mapping point coordinates, the angular relationship, and the AR distance relationship;
second AR coordinates of the mobile device 200 in the AR scene are calculated from a positional relationship between the mobile device 200 and the target camera in the AR scene to determine AR mapping coordinates of the mobile device 200 in the AR scene.
In a possible implementation manner, the mobile tag display device 110 may further include an adjustment module, where the adjustment module may be configured to, when detecting that the operating parameter of the target camera changes, adjust the operating state of the virtual camera according to the changed operating parameter, and correspondingly adjust the display position of the AR tag of each mobile device 200 in the AR scene according to the adjusted operating state of the virtual camera.
In a possible implementation manner, the mobile tag display apparatus 110 may further include an adding module, where the adding module may be configured to add, when detecting the newly added mobile device in the monitored scene of the target camera, a display position of the AR tag of the newly added mobile device in the AR scene.
Referring to fig. 10, a schematic block diagram of a terminal device 100 for performing the above mobile tag display method according to an embodiment of the present application is shown, based on the same inventive concept, and the terminal device 100 may include a mobile tag display apparatus 110, a machine-readable storage medium 120, and a processor 130.
In this embodiment, the machine-readable storage medium 120 and the processor 130 are both located in the terminal device 100 and are separately provided. However, it should be understood that the machine-readable storage medium 120 may also be separate from the terminal device 100 and accessible by the processor 130 through a bus interface. In the alternative, machine-readable storage medium 120 may be integrated into processor 130, for example, machine-readable storage medium 120 may also be a cache and/or general purpose registers.
The processor 130 is a control center of the terminal device 100, connects various parts of the entire terminal device 100 using various interfaces and lines, and performs various functions of the terminal device 100 and processes data by running or executing software programs and/or modules stored in the machine-readable storage medium 120, and calling data stored in the machine-readable storage medium 120, thereby controlling the terminal device 100 as a whole. Optionally, the processor 130 may include one or more processing cores; for example, processor 130 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor.
The processor 130 may be a general-purpose central processing unit (Central Processing Unit, CPU), microprocessor, application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the program of the mobile tag display method provided in the above method embodiment.
The machine-readable storage medium 120 may be, but is not limited to, a ROM or other type of static storage device capable of storing static information and instructions, a RAM or other type of dynamic storage device, an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Compact Disc Read-Only Memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The machine-readable storage medium 120 may exist independently and be coupled to the processor 130 by a communication bus, or may be integrated with the processor. The machine-readable storage medium 120 is used to store machine-executable instructions for carrying out the solutions of the present application, and the processor 130 is configured to execute the machine-executable instructions stored in the machine-readable storage medium 120 to implement the mobile tag display method provided by the foregoing method embodiments.
The mobile tag display apparatus 110 may include various functional modules (e.g., the obtaining module 111, the constructing module 112, the determining module 113, and the tag display module 114) described in fig. 9, for example, and may be stored in the machine-readable storage medium 120 in the form of software program codes, and the processor 130 may implement the mobile tag display method provided by the foregoing method embodiment by executing the various functional modules of the mobile tag display apparatus 110.
Since the terminal device 100 provided in the embodiment of the present application is another implementation form of the method embodiment executed by the terminal device 100, and the terminal device 100 may be used to execute the mobile tag display method provided in the method embodiment, the technical effects that can be obtained by the terminal device 100 may refer to the method embodiment and will not be described herein.
Further, the present application also provides a readable storage medium containing computer executable instructions, which when executed, may be used to implement the mobile tag display method provided in the above method embodiment.
Of course, the storage medium containing the computer executable instructions provided in the embodiments of the present application is not limited to the above method operations, and may also perform the related operations in the mobile tag display method provided in any embodiment of the present application.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Although the present application has been described herein in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed application, from a review of the figures, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
The foregoing is merely various embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. A mobile tag display method, characterized in that it is applied to a terminal device, the method comprising:
obtaining equipment information of each mobile equipment accessed in a monitoring scene, wherein the equipment information comprises equipment position information;
constructing an Augmented Reality (AR) scene of a target camera according to the position information of the selected target camera, and recording a proportional relationship between the AR scene and an actual scene, wherein the AR scene comprises a virtual camera corresponding to the target camera;
determining AR mapping coordinates of each mobile device in the AR scene according to the device position information of each mobile device, the position information of the target camera and the proportional relation;
displaying the AR label of each mobile device in the AR scene according to the AR mapping coordinates of each mobile device in the AR scene;
The step of determining the AR mapping coordinates of each mobile device in the AR scene according to the device position information of each mobile device, the position information of the target camera and the proportional relation comprises the following steps:
for each mobile device, determining a distance relation between the real coordinates of the mobile device and the mapping coordinates of the target camera in a set plane coordinate system according to the device position information of the mobile device and the position information of the target camera; determining an angle relation between the real coordinates of the mobile device and the mapping coordinates of the target camera in a set plane coordinate system; and determining AR mapping coordinates of each mobile device in the AR scene according to the distance relation, the angle relation and the proportional relation.
2. The mobile tag display method according to claim 1, wherein the step of constructing an augmented reality AR scene of the target camera according to the position information of the selected target camera comprises:
according to the position information of the target camera, constructing a three-dimensional simulation scene corresponding to the target camera according to a preset three-dimensional drawing protocol;
Monitoring the working parameter change of the target camera, and controlling the virtual camera to execute synchronous actions corresponding to the target camera in the three-dimensional simulation scene according to the working parameter change so as to construct an Augmented Reality (AR) scene of the target camera.
3. The mobile tag display method according to claim 1, wherein the real coordinates of the mobile device are located in the set plane coordinate system, and the step of determining an angular relationship between the real coordinates of the mobile device and the mapped coordinates of the target camera in the set plane coordinate system includes:
calculating a sine angle relation between the real coordinates of the mobile equipment and the mapping coordinates of the target camera in the direction of a first coordinate axis in the set plane coordinate system;
calculating cosine angle relation between the real coordinates of the mobile equipment and the mapping coordinates of the target camera in the direction of a second coordinate axis in the set plane coordinate system;
the first coordinate axis and the second coordinate axis are perpendicular to each other, and the sine angle relation and the cosine angle relation form an angle relation between a real coordinate of the mobile device and a mapping coordinate of the target camera in a set plane coordinate system.
4. The mobile tag display method according to claim 1, wherein the step of determining AR mapping coordinates of each mobile device in the AR scene according to the distance relation, the angle relation, and the scale relation comprises:
determining an AR distance relation between the AR coordinates of the mobile device and the AR coordinates of the virtual camera in the AR scene according to the distance relation and the proportional relation;
determining a first AR coordinate of the virtual camera in the AR scene, and determining an AR mapping point coordinate of the first AR coordinate in the set plane coordinate system;
determining a position relationship between the mobile device and the target camera in the AR scene according to the AR mapping point coordinates, the angle relationship and the AR distance relationship;
and calculating second AR coordinates of the mobile device in the AR scene according to the position relation between the mobile device and the target camera in the AR scene so as to determine AR mapping coordinates of the mobile device in the AR scene.
5. The mobile tag display method according to any one of claims 1 to 4, characterized in that the method further comprises:
When the working parameters of the target camera are detected to change, the working state of the virtual camera is adjusted according to the changed working parameters;
and correspondingly adjusting the display position of the AR label of each mobile device in the AR scene according to the adjusted working state of the virtual camera.
6. The mobile tag display method according to any one of claims 1 to 4, characterized in that the method further comprises:
and when detecting a new mobile device in the monitored scene of the target camera, adding the display position of the AR label of the new mobile device in the AR scene.
7. A mobile tag display apparatus, characterized by being applied to a terminal device, the apparatus comprising:
the acquisition module is used for acquiring equipment information of each mobile equipment accessed in the monitoring scene, wherein the equipment information comprises equipment position information;
the building module is used for building an Augmented Reality (AR) scene of the target camera according to the position information of the selected target camera and recording the proportional relation between the AR scene and the actual scene, wherein the AR scene comprises a virtual camera corresponding to the target camera;
The determining module is used for determining AR mapping coordinates of each mobile device in the AR scene according to the device position information of each mobile device, the position information of the target camera and the proportional relation;
the tag display module is used for displaying the AR tag of each mobile device in the AR scene according to the AR mapping coordinates of each mobile device in the AR scene;
the determining module is specifically configured to:
for each mobile device, determining a distance relation between the real coordinates of the mobile device and the mapping coordinates of the target camera in a set plane coordinate system according to the device position information of the mobile device and the position information of the target camera; determining an angle relation between the real coordinates of the mobile device and the mapping coordinates of the target camera in a set plane coordinate system; and determining AR mapping coordinates of each mobile device in the AR scene according to the distance relation, the angle relation and the proportional relation.
8. A terminal device comprising a machine-readable storage medium storing machine-executable instructions and a processor which, when executing the machine-executable instructions, implements the mobile tag display method of any one of claims 1-6.
9. A readable storage medium having stored therein machine executable instructions that when executed implement the mobile tag display method of any one of claims 1-6.
CN201911150100.1A 2019-11-21 2019-11-21 Mobile tag display method, device, terminal equipment and readable storage medium Active CN112825198B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911150100.1A CN112825198B (en) 2019-11-21 2019-11-21 Mobile tag display method, device, terminal equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911150100.1A CN112825198B (en) 2019-11-21 2019-11-21 Mobile tag display method, device, terminal equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN112825198A CN112825198A (en) 2021-05-21
CN112825198B (en) 2024-04-05

Family

ID=75907335

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911150100.1A Active CN112825198B (en) 2019-11-21 2019-11-21 Mobile tag display method, device, terminal equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN112825198B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114116110A (en) * 2021-07-20 2022-03-01 上海诺司纬光电仪器有限公司 Intelligent interface based on augmented reality

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109782901A (en) * 2018-12-06 2019-05-21 网易(杭州)网络有限公司 Augmented reality exchange method, device, computer equipment and storage medium
CN109961522A (en) * 2019-04-02 2019-07-02 百度国际科技(深圳)有限公司 Image projecting method, device, equipment and storage medium
CN110097061A (en) * 2019-04-16 2019-08-06 聚好看科技股份有限公司 A kind of image display method and apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9367961B2 (en) * 2013-04-15 2016-06-14 Tencent Technology (Shenzhen) Company Limited Method, device and storage medium for implementing augmented reality

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109782901A (en) * 2018-12-06 2019-05-21 网易(杭州)网络有限公司 Augmented reality exchange method, device, computer equipment and storage medium
CN109961522A (en) * 2019-04-02 2019-07-02 百度国际科技(深圳)有限公司 Image projecting method, device, equipment and storage medium
CN110097061A (en) * 2019-04-16 2019-08-06 聚好看科技股份有限公司 A kind of image display method and apparatus

Also Published As

Publication number Publication date
CN112825198A (en) 2021-05-21

Similar Documents

Publication Publication Date Title
US10740975B2 (en) Mobile augmented reality system
CA3096601C (en) Presenting image transition sequences between viewing locations
US20210209857A1 (en) Techniques for capturing and displaying partial motion in virtual or augmented reality scenes
Zollmann et al. Augmented reality for construction site monitoring and documentation
US8633970B1 (en) Augmented reality with earth data
US8803992B2 (en) Augmented reality navigation for repeat photography and difference extraction
EP2972723B1 (en) Smooth draping layer for rendering vector data on complex three dimensional objects
CN112396686A (en) Three-dimensional scene engineering simulation and live-action fusion system and method
CN107084740B (en) Navigation method and device
CN111031293B (en) Panoramic monitoring display method, device and system and computer readable storage medium
EP3655928B1 (en) Soft-occlusion for computer graphics rendering
US10733777B2 (en) Annotation generation for an image network
JP6571262B2 (en) Display objects based on multiple models
JP6768123B2 (en) Augmented reality methods and equipment
CN112714266B (en) Method and device for displaying labeling information, electronic equipment and storage medium
Fukuda et al. Improvement of registration accuracy of a handheld augmented reality system for urban landscape simulation
CN110910504A (en) Method and device for determining three-dimensional model of region
CN105931284B (en) Fusion method and device of three-dimensional texture TIN data and large scene data
CN112825198B (en) Mobile tag display method, device, terminal equipment and readable storage medium
CN110543236A (en) Machine room monitoring system and method based on virtual reality technology
KR102314782B1 (en) apparatus and method of displaying three dimensional augmented reality
CN109816791B (en) Method and apparatus for generating information
CN111818265A (en) Interaction method and device based on augmented reality model, electronic equipment and medium
CN109034214B (en) Method and apparatus for generating a mark
Luley et al. Mobile augmented reality for tourists–MARFT

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant