CN112825198A - Mobile label display method and device, terminal equipment and readable storage medium - Google Patents

Mobile label display method and device, terminal equipment and readable storage medium

Info

Publication number
CN112825198A
Authority
CN
China
Prior art keywords
scene
mobile device
target camera
mobile
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911150100.1A
Other languages
Chinese (zh)
Other versions
CN112825198B (en)
Inventor
许红锦
史有华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Uniview Technologies Co Ltd
Original Assignee
Zhejiang Uniview Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Uniview Technologies Co Ltd filed Critical Zhejiang Uniview Technologies Co Ltd
Priority to CN201911150100.1A priority Critical patent/CN112825198B/en
Publication of CN112825198A publication Critical patent/CN112825198A/en
Application granted granted Critical
Publication of CN112825198B publication Critical patent/CN112825198B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/005 - General purpose rendering architectures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/006 - Mixed reality

Abstract

The embodiment of the application provides a mobile tag display method, a mobile tag display device, a terminal device and a readable storage medium. An augmented reality (AR) scene of the target camera is constructed according to the position information of the selected target camera, and the proportional relation between the AR scene and the actual scene is recorded. The AR mapping coordinates of each mobile device in the AR scene are then determined according to the device position information of each mobile device, the position information of the target camera and the proportional relation, so that the AR tag of each mobile device is displayed in the AR scene. In this way, mobile device tags can be automatically accessed and displayed in batches in the AR live picture of the target camera, with the AR scene simulating the actual scene. Compared with processing on the front-end device side, this is more flexible and portable: it does not depend on the front-end device, is not limited by its hardware, and does not rely on the camera for processing. Because the mobile device tags are not coupled with the camera, they do not need to be stored in the camera's code stream, which improves the real-time performance of tag display.

Description

Mobile label display method and device, terminal equipment and readable storage medium
Technical Field
The application relates to the technical field of monitoring, in particular to a mobile tag display method, a mobile tag display device, a terminal device and a readable storage medium.
Background
At present, augmented reality scenes that use three-dimensional modeling to interpret real-time video are increasingly applied in the field of video monitoring. Augmented reality means adding new information to an image in a real-time or near real-time manner with computer-generated augmentation: elements in the real world are labeled and virtualized, so that the labeled and virtualized elements can be flexibly added to and presented on a display interface.
With the development of the technology, the technique of superimposing AR (Augmented Reality) tags on the video of a camera has broad application prospects in the field of city monitoring. In current schemes, however, tags are usually only added for fixed point locations, which makes it difficult to satisfy current command and scheduling requirements well.
Disclosure of Invention
In view of the above, an object of the present application is to provide a mobile tag display method, apparatus, terminal device and readable storage medium, which can automatically access and display a mobile device tag in an AR live view of a target camera.
According to an aspect of the present application, there is provided a mobile tag display method applied to a terminal device, the method including:
acquiring equipment information of each mobile equipment accessed in a monitoring scene, wherein the equipment information comprises equipment position information;
constructing an Augmented Reality (AR) scene of the target camera according to the selected position information of the target camera, and recording the proportional relation between the AR scene and an actual scene, wherein the AR scene comprises a virtual camera corresponding to the target camera;
determining an AR mapping coordinate of each mobile device in the AR scene according to the device position information of each mobile device, the position information of the target camera and the proportional relation;
and displaying the AR label of each mobile device in the AR scene according to the AR mapping coordinate of each mobile device in the AR scene.
In a possible implementation, the step of constructing an augmented reality AR scene of the target camera according to the position information of the selected target camera includes:
according to the position information of the target camera, a three-dimensional simulation scene corresponding to the target camera is constructed according to a preset three-dimensional drawing protocol;
monitoring the working parameter change of the target camera, and controlling the virtual camera to execute the synchronous action corresponding to the target camera in the three-dimensional simulation scene according to the working parameter change so as to construct the Augmented Reality (AR) scene of the target camera.
In a possible implementation manner, the step of determining the AR mapping coordinates of each mobile device in the AR scene according to the device location information of each mobile device, the location information of the target camera, and the proportional relationship includes:
for each mobile device, determining a distance relation between real coordinates of the mobile device and mapping coordinates of the target camera in a set plane coordinate system according to device position information of the mobile device and position information of the target camera;
determining an angle relation between real coordinates of the mobile equipment and mapping coordinates of the target camera in a set plane coordinate system;
and determining the AR mapping coordinate of each mobile device in the AR scene according to the distance relation, the angle relation and the proportional relation.
In a possible embodiment, the real coordinates of the mobile device are located in the set plane coordinate system, and the step of determining the angular relationship between the real coordinates of the mobile device and the mapping coordinates of the target camera in the set plane coordinate system includes:
calculating the sine angle relation between the real coordinate of the mobile equipment and the mapping coordinate of the target camera in the direction of a first coordinate axis in the set plane coordinate system;
calculating the cosine angle relation between the real coordinate of the mobile equipment and the mapping coordinate of the target camera in the direction of a second coordinate axis in the set plane coordinate system;
the first coordinate axis and the second coordinate axis are perpendicular to each other, and the sine angle relationship and the cosine angle relationship form an angle relationship between a real coordinate of the mobile device and a mapping coordinate of the target camera in a set plane coordinate system.
In a possible implementation manner, the step of determining AR mapping coordinates of each mobile device in the AR scene according to the distance relationship, the angle relationship, and the proportional relationship includes:
determining the AR distance relationship between the AR coordinates of the mobile equipment in the AR scene and the AR coordinates of the virtual camera in the AR scene according to the distance relationship and the proportional relationship;
determining a first AR coordinate of the virtual camera in the AR scene, and determining an AR mapping point coordinate of the first AR coordinate in the set plane coordinate system;
determining the position relation between the mobile equipment and the target camera in the AR scene according to the coordinate of the AR mapping point, the angle relation and the AR distance relation;
and calculating second AR coordinates of the mobile device in the AR scene according to the position relation between the mobile device and the target camera in the AR scene so as to determine the AR mapping coordinates of the mobile device in the AR scene.
In one possible embodiment, the method further comprises:
when the working parameters of the target camera are detected to be changed, the working state of the virtual camera is adjusted according to the changed working parameters;
and correspondingly adjusting the display position of the AR label of each mobile device in the AR scene according to the adjusted working state of the virtual camera.
In one possible embodiment, the method further comprises:
and when detecting that a mobile device is newly added in the monitoring scene of the target camera, adding the display position of the AR label of the newly added mobile device in the AR scene.
According to another aspect of the present application, there is provided a mobile tag display apparatus applied to a terminal device, the apparatus including:
the system comprises an obtaining module, a monitoring module and a processing module, wherein the obtaining module is used for obtaining equipment information of each mobile equipment accessed in a monitoring scene, and the equipment information comprises equipment position information;
the construction module is used for constructing an Augmented Reality (AR) scene of the target camera according to the selected position information of the target camera and recording the proportional relation between the AR scene and an actual scene, wherein the AR scene comprises a virtual camera corresponding to the target camera;
the determining module is used for determining an AR mapping coordinate of each mobile device in the AR scene according to the device position information of each mobile device, the position information of the target camera and the proportional relation;
and the tag display module is used for displaying the AR tag of each mobile device in the AR scene according to the AR mapping coordinate of each mobile device in the AR scene.
According to another aspect of the present application, a terminal device is provided, where the terminal device includes a machine-readable storage medium and a processor, the machine-readable storage medium stores machine-executable instructions, and the processor, when executing the machine-executable instructions, implements the foregoing mobile tag display method.
According to another aspect of the present application, there is provided a readable storage medium having stored therein machine executable instructions which, when executed, implement the aforementioned mobile tag display method.
Based on any one of the above aspects, the augmented reality AR scene of the target camera is constructed according to the position information of the selected target camera, the proportional relation between the AR scene and the actual scene is recorded, and the AR mapping coordinates of each mobile device in the AR scene are then determined according to the device position information of each mobile device, the position information of the target camera and the proportional relation, so that the AR tag of each mobile device is displayed in the AR scene. In this way, mobile device tags can be automatically accessed and displayed in batches in the AR live picture of the target camera, with the AR scene simulating the actual scene. Compared with processing on the front-end device side, this is more flexible and portable: it does not depend on the front-end device, is not limited by its hardware, and does not rely on the camera for processing. Because the mobile device tags are not coupled with the camera, they do not need to be stored in the camera's code stream, which improves the real-time performance of tag display.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained from the drawings without inventive effort.
Fig. 1 is a schematic view illustrating an application scenario of a mobile tag display method provided in an embodiment of the present application;
fig. 2 is a flowchart illustrating one of the mobile tag display methods provided in the embodiments of the present application;
FIG. 3 shows a flow diagram of the substeps of step S120 shown in FIG. 2;
FIG. 4 is a flow diagram illustrating sub-steps of step S130 shown in FIG. 2;
fig. 5 is a schematic diagram illustrating determination of a position relationship between a mobile device and the virtual camera according to an embodiment of the present application;
fig. 6 shows a second schematic diagram for determining a position relationship between a mobile device and the virtual camera provided in the embodiment of the present application;
fig. 7 is a second flowchart of a mobile tag display method provided in the embodiment of the present application;
fig. 8 is a third schematic flowchart illustrating a mobile tag display method according to an embodiment of the present application;
fig. 9 is a schematic diagram illustrating functional modules of a mobile tag display apparatus according to an embodiment of the present application;
fig. 10 shows a schematic block diagram of a structure of a terminal device for implementing the above-mentioned mobile tag display method according to an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not used to limit the scope of protection of the present application. Additionally, it should be understood that the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some of the embodiments of the present application. It should be understood that the operations of the flow diagrams may be performed out of order, and steps without logical context may be performed in reverse order or simultaneously. One skilled in the art, under the guidance of this application, may add one or more other operations to, or remove one or more operations from, the flowchart.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
For the technical problems mentioned in the foregoing background art, the automatic tagging algorithm currently applied to AR live pictures mainly converts the point location information to be tagged into PT value coordinates relative to the image capturing device (where P denotes a horizontal angle and T denotes a pitch angle), calculates display coordinates in a corresponding plane coordinate system from the PT values, and superimposes the tag on the image picture of the image capturing device. The generated tag information of the monitoring point is usually stored in the code stream of the image capturing device, so that the image picture can still present the tag information of the monitoring point after the camera in the image capturing device is restarted.
However, in the above scheme, the position information of the marked point locations is usually recorded by the front-end device. Although batch automatic addition of AR tags is realized, the front-end device is required to have corresponding processing hardware, and the tag information of the monitoring points has to be coupled with the code stream of the camera device, so access for real-time moving point location tags cannot be realized.
For this reason, based on the findings of the above technical problems, the inventors propose the following technical solutions to solve or improve the above problems. It should be noted that the shortcomings of the above prior art solutions were found by the inventor through practice and careful study. Therefore, the discovery process of the above problems and the solutions proposed by the embodiments of the present application in the following description should be regarded as the inventor's contribution to the present application in the course of the invention creation process, and should not be understood as technical content already known to those skilled in the art.
Fig. 1 is a schematic diagram illustrating an application scenario of a mobile tag display method according to an embodiment of the present application. The application scenario may include a terminal device 100, mobile devices 200, and a camera 300. The terminal device 100 may be communicatively connected with a plurality of mobile devices 200 and the camera 300, and the camera 300 may be used to monitor a real-time picture of the scene where the mobile devices 200 are located. The mobile devices 200 may be, but are not limited to, any electronic devices with a positioning function, such as an in-vehicle central control terminal, a law enforcement recorder, a drone, and the like. These mobile devices 200 may access the terminal device 100 in advance and send their own device information to the terminal device 100 in real time or at a certain period, so that the terminal device 100 adds corresponding AR tags in the monitoring picture of the camera 300.
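As an illustration of the kind of device information such access might carry, the hypothetical TypeScript sketch below shows a periodic report containing a device identifier and longitude/latitude position; the field names and the Map-based cache are assumptions for illustration only and are not prescribed by this application.

```typescript
// Hypothetical shape of the device information a mobile device 200 could
// report to the terminal device 100; field names are illustrative only.
interface MobileDeviceInfo {
  deviceId: string;      // unique identifier of the mobile device
  name: string;          // display name used for the AR tag text
  longitude: number;     // device position, degrees (X in the later formulas)
  latitude: number;      // device position, degrees (Y in the later formulas)
  reportedAt: number;    // timestamp of the report, in milliseconds
}

// The terminal device could keep the latest report per device, refreshed
// in real time or at a fixed period.
const deviceTable = new Map<string, MobileDeviceInfo>();

function onDeviceReport(info: MobileDeviceInfo): void {
  deviceTable.set(info.deviceId, info);
}
```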
Fig. 2 is a flowchart illustrating a mobile tag display method according to an embodiment of the present application, where in this embodiment, the mobile tag display method may be executed by the terminal device 100 shown in fig. 1. It should be understood that, in other embodiments, the order of some steps in the mobile tag display method of this embodiment may be interchanged according to actual needs, or some steps may be omitted or deleted. The detailed steps of the mobile tag display method are described as follows.
Step S110, obtain the device information of each mobile device 200 accessed in the monitoring scene.
And step S120, constructing an Augmented Reality (AR) scene of the target camera according to the selected position information of the target camera, and recording the proportional relation between the AR scene and the actual scene.
Step S130, determining AR mapping coordinates of each mobile device 200 in the AR scene according to the device location information of each mobile device 200, the location information of the target camera, and the proportional relationship.
Step S140, displaying the AR label of each mobile device 200 in the AR scene according to the AR mapping coordinates of each mobile device 200 in the AR scene.
In this embodiment, the device information of the mobile device 200 may include device location information, which may be, but is not limited to, longitude and latitude information; correspondingly, the location information of the selected target camera may also be, but is not limited to, longitude and latitude information. It should be noted that the user may select any camera 300 to be viewed as the target camera through the terminal device 100 according to actual needs, which is not limited in this embodiment. For example, a camera 300 located at a high point in the monitored scene is generally selected as the target camera, so that the AR tags of the mobile devices 200 can subsequently be added more completely in the AR live view of the target camera.
In this embodiment, the AR scene may include a virtual camera corresponding to the target camera, so that the AR tag of the mobile device 200 may be added in association with the virtual camera in the following, without processing by the camera 300, and since the tag of the mobile device 200 does not need to be coupled with the camera 300, the tag of the mobile device 200 may not need to be stored in a code stream of the camera 300, thereby improving real-time performance of tag display.
Thus, based on the above steps, the mobile tag display method provided in this embodiment constructs an augmented reality AR scene of the target camera according to the position information of the selected target camera, records the proportional relationship between the AR scene and the actual scene, and then determines the AR mapping coordinates of each mobile device 200 in the AR scene according to the device position information of each mobile device 200, the position information of the target camera, and the proportional relationship, so as to display the AR tag of each mobile device 200 in the AR scene. In this way, the tags of the mobile devices 200 can be automatically accessed and displayed in batches in the AR live picture of the target camera, with the AR scene simulating the actual scene. Compared with processing on the front-end device side, this is more flexible and portable: it does not depend on the front-end device, is not limited by its hardware, and does not rely on the camera 300 for processing. Because the tags of the mobile devices 200 are not coupled with the camera 300, they do not need to be stored in the code stream of the camera 300, which improves the real-time performance of tag display.
In one possible implementation, for step S120, in order to improve the simulation experience of the augmented reality AR scene of the target camera, an exemplary way of constructing the augmented reality AR scene of the target camera is described below with reference to fig. 3. Referring to fig. 3, step S120 may be implemented by the following sub-steps:
Sub-step S121, constructing a three-dimensional simulation scene corresponding to the target camera according to the position information of the target camera and a preset three-dimensional drawing protocol.
Sub-step S122, monitoring the working parameter changes of the target camera, and controlling the virtual camera to execute the synchronization actions corresponding to the target camera in the three-dimensional simulation scene according to the working parameter changes, so as to construct the augmented reality AR scene of the target camera.
In this embodiment, in sub-step S121, after the position information of the target camera is determined, the basic scene information of the three-dimensional simulation scene to be constructed may be determined according to the position information of the target camera. On this basis, a three-dimensional simulation scene corresponding to the target camera can be constructed according to a preset three-dimensional drawing protocol, and the three-dimensional simulation scene can include a virtual camera corresponding to the target camera.
The preset three-dimensional drawing protocol can be selected according to actual requirements. For example, the WebGL (Web Graphics Library) protocol may be selected to build the three-dimensional simulation scene. WebGL allows JavaScript and OpenGL ES 2.0 to be combined: by adding a JavaScript binding for OpenGL ES 2.0, hardware-accelerated 3D rendering can be provided for the HTML5 Canvas, so that the 3D scene model can be displayed with the help of the system graphics card.
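A minimal sketch of this choice is given below, assuming the three-dimensional simulation scene is rendered in a browser page on the terminal device: a WebGL context is obtained from an HTML5 Canvas and a simple virtual-camera state is kept alongside it. The element id and the numeric values (field of view, clipping planes) are placeholders, not parameters prescribed by the application.

```typescript
// Minimal WebGL bootstrap for the three-dimensional simulation scene
// (illustrative values only; a production scene would also load terrain and
// models around the target camera's position).
const canvas = document.getElementById("ar-scene") as HTMLCanvasElement;
const gl = canvas.getContext("webgl");
if (!gl) {
  throw new Error("WebGL is not supported by this browser");
}
gl.viewport(0, 0, canvas.width, canvas.height);
gl.enable(gl.DEPTH_TEST);

// Virtual camera state mirroring the target camera; a perspective projection
// matrix would be derived from these values each frame.
const virtualCamera = {
  position: [0, 25, 0],   // AR coordinates of the virtual camera
  fovDegrees: 60,         // adjusted when the target camera zooms
  pan: 0,                 // horizontal angle P
  tilt: 0,                // pitch angle T
  near: 0.1,
  far: 5000,
};
```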
In sub-step S122, considering that during actual monitoring the target camera does not always stay in a fixed working state and may perform operations such as rotation and zooming, the virtual camera may be controlled, according to the working parameter changes of the target camera, to perform the corresponding synchronization actions in the three-dimensional simulation scene, in order to improve the simulation experience of the augmented reality AR scene of the target camera. For example, a corresponding control class may be constructed for the virtual camera that simulates how the actual camera 300 is used. The control class monitors the parameter changes of the target camera, transmits them to the three-dimensional simulation model, and processes the parameters, so that the changes of the virtual camera are controlled to stay synchronized with the actual target camera. The augmented reality AR scene constructed in this way is more realistic and closer to the actual scene.
Optionally, the operating parameter may be, but is not limited to, a gyroscope parameter, a zoom parameter, a rotation parameter, and the like, which is not limited in this embodiment.
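A sketch of such a control class is given below, assuming the terminal device can observe the target camera's parameter changes (only pan/tilt/zoom are shown; gyroscope data could be forwarded the same way). The class and field names, and the zoom-to-field-of-view mapping, are illustrative assumptions rather than part of the application.

```typescript
// Hypothetical working parameters of the target camera.
interface CameraParams {
  pan: number;    // horizontal angle, degrees
  tilt: number;   // pitch angle, degrees
  zoom: number;   // zoom multiplier
}

// Control class that mirrors the target camera's parameter changes onto the
// virtual camera of the three-dimensional simulation scene.
class VirtualCameraController {
  constructor(
    private readonly cam: { pan: number; tilt: number; fovDegrees: number },
  ) {}

  // Called whenever a parameter change of the target camera is observed.
  onTargetCameraChange(params: CameraParams): void {
    this.cam.pan = params.pan;
    this.cam.tilt = params.tilt;
    // Assumed mapping from zoom multiplier to field of view; the real
    // relationship depends on the target camera's lens.
    this.cam.fovDegrees = 60 / params.zoom;
  }
}
```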
In one possible implementation, for step S130, the position relationship between each mobile device 200 and the virtual camera in the AR scene may be used to represent at which position each mobile device 200 is displayed in the AR scene. Since the virtual camera and the target camera have a certain proportional relationship and, during actual display, the AR scene is presented to the user in the form of a two-dimensional picture, the position relationship between each mobile device 200 and the virtual camera needs to be calculated accurately. Based on this, an exemplary calculation manner is given below in conjunction with fig. 4-6 to describe step S130 in detail. Referring to fig. 4, step S130 may be implemented by the following sub-steps:
and a substep S131, determining, for each mobile device 200, a distance relationship between the real coordinates of the mobile device 200 and the mapping coordinates of the target camera in the set plane coordinate system according to the device position information of the mobile device 200 and the position information of the target camera.
Sub-step S132, determining an angular relationship between the real coordinates of the mobile device 200 and the mapped coordinates of the target camera in the set plane coordinate system.
And a substep S133, determining the AR mapping coordinates of each mobile device 200 in the AR scene according to the distance relationship, the angle relationship and the proportional relationship.
In one possible example, for sub-step S131, referring to fig. 5, assume that the current real coordinates of mobile device B are (X2, Y2, 0) and the real coordinates of target camera A are (X1, Y1, Z1). Taking the case where the set plane coordinate system is the coordinate system of the Earth's ground plane, the distance relationship L between the real coordinates of mobile device B and the mapping point of target camera A on the ground plane can be obtained from the distance relationship between the two points as:
L = R × arccos(cos(90 - Y2) × cos(90 - Y1) + sin(90 - Y2) × sin(90 - Y1) × cos(X2 - X1)) × π/180
where R denotes the radius of the Earth.
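Expressed in code, the relation above is the spherical law of cosines applied to the colatitudes (90 - Y). The sketch below assumes longitude/latitude in degrees and an Earth radius R in meters; because Math.acos returns radians, the trailing × π/180 of the published formula is absorbed by the degree-to-radian conversion.

```typescript
const EARTH_RADIUS_M = 6371000; // assumed mean Earth radius R, in meters

const toRad = (deg: number): number => (deg * Math.PI) / 180;

// Ground-plane distance L between mobile device B (X2, Y2) and the mapping
// point of target camera A (X1, Y1); longitudes X and latitudes Y in degrees.
function groundDistance(x1: number, y1: number, x2: number, y2: number): number {
  const c =
    Math.cos(toRad(90 - y2)) * Math.cos(toRad(90 - y1)) +
    Math.sin(toRad(90 - y2)) * Math.sin(toRad(90 - y1)) * Math.cos(toRad(x2 - x1));
  // Clamp to [-1, 1] to avoid NaN from floating-point drift.
  return EARTH_RADIUS_M * Math.acos(Math.min(1, Math.max(-1, c)));
}
```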
in one possible example, for sub-step S132, a sine angle relationship between the real coordinates of the mobile device 200 and the mapping coordinates of the target camera in a first coordinate axis direction in the set plane coordinate system may be calculated, and a cosine angle relationship between the real coordinates of the mobile device 200 and the mapping coordinates of the target camera in a second coordinate axis direction in the set plane coordinate system may be calculated. The first coordinate axis and the second coordinate axis are perpendicular to each other, and the sine angle relationship and the cosine angle relationship form an angle relationship between a real coordinate of the mobile device 200 and a mapping coordinate of the target camera in a set plane coordinate system.
For example, referring to fig. 6, the angle relationship between the real coordinates of the mobile device 200 and the mapping coordinates of the target camera in the set plane coordinate system may satisfy the following relationships (given as formula images in the original publication and not reproduced here): a sine angle relationship between the real coordinates of the mobile device 200 and the mapping coordinates of the target camera in the direction of the first coordinate axis of the set plane coordinate system, and a cosine angle relationship between the real coordinates of the mobile device 200 and the mapping coordinates of the target camera in the direction of the second coordinate axis of the set plane coordinate system.
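Since the published formula images for the sine and cosine relationships are not reproduced above, the sketch below is one plausible reading, offered explicitly as an assumption: θ is treated as the bearing from the target camera's ground mapping point to the mobile device, with sin θ along the first (longitude) axis and cos θ along the second (latitude) axis.

```typescript
// Assumed interpretation of the angle relationship: bearing of the mobile
// device as seen from the camera's ground mapping point. dEast and dNorth are
// small-offset approximations of the longitude/latitude differences converted
// to meters; this is an illustrative sketch, not the published formula.
function bearingComponents(x1: number, y1: number, x2: number, y2: number) {
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const dNorth = (y2 - y1) * 111320;                      // meters per degree of latitude (approx.)
  const dEast = (x2 - x1) * 111320 * Math.cos(toRad(y1)); // meters per degree of longitude at Y1
  const theta = Math.atan2(dEast, dNorth);                // bearing measured from north
  return { sinTheta: Math.sin(theta), cosTheta: Math.cos(theta) };
}
```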
On the basis, referring to fig. 5 again, in sub-step S133, the AR distance relationship between the AR coordinates of the mobile device 200 and the AR coordinates of the virtual camera in the AR scene may be determined according to the distance relationship and the proportional relationship.
For example, with the aforementioned scaling factor denoted as K, the AR distance relationship L1 between the AR coordinates of the mobile device 200 and the AR coordinates of the virtual camera in the AR scene may be calculated as L1 = L/K.
Then, a first AR coordinate of the virtual camera in the AR scene may be determined, and the AR mapping point coordinate of the first AR coordinate in the set plane coordinate system may be determined. For example, as shown in fig. 5, if the first AR coordinate of the virtual camera in the AR scene is known in advance to be (A1, B1, C1), the AR mapping point coordinate of the first AR coordinate (A1, B1, C1) in the AR scene may be determined to be (A1, 0, C1).
Next, the position relationship between the mobile device 200 and the target camera in the AR scene may be determined according to the AR mapping point coordinate (A1, 0, C1), the angle relationship, and the AR distance relationship L1. Still taking fig. 5 as an example, assume that the mobile device 200 is located at point B2 with coordinate (A2, 0, C2) in the virtual scene; the position relationship between the mobile device 200 and the target camera in the AR scene then satisfies the relationships given in the corresponding formula images of the original publication (not reproduced here).
based on the above, the second AR coordinate of the mobile device 200 in the AR scene may be calculated according to the position relationship between the mobile device 200 and the target camera in the AR scene to determine the AR mapping coordinate of the mobile device 200 in the AR scene.
Still taking fig. 5 as an example, the second AR coordinate of the mobile device 200 in the AR scene can then be obtained (its expression is given as a formula image in the original publication and is not reproduced here). Thus, in step S140, a tag point can be determined in the AR scene as the AR tag position of the mobile device 200 according to the second AR coordinate; for example, the tag point may be assigned the coordinates of the second AR coordinate. In this way, the AR tag of the mobile device 200 can be displayed in the AR scene.
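Putting these pieces together, and assuming (since the corresponding formula images are not reproduced in this text) that the tag position is obtained by stepping the scaled distance L1 = L/K away from the virtual camera's mapping point (A1, 0, C1) along the bearing θ, a sketch of sub-step S133 could look like the following; the sign conventions are assumptions.

```typescript
// Sketch of sub-step S133 under the stated assumptions: scale the real-world
// distance into the AR scene and offset the virtual camera's mapping point
// along the bearing to get the tag position (A2, 0, C2).
function arMappingCoordinate(
  cameraMappingPoint: { a1: number; c1: number }, // (A1, 0, C1), zero omitted
  distanceL: number,                              // ground distance L, meters
  scaleK: number,                                 // proportional relation K (meters per AR unit)
  sinTheta: number,
  cosTheta: number,
): { a2: number; b2: number; c2: number } {
  const l1 = distanceL / scaleK;                  // AR distance relationship L1 = L / K
  return {
    a2: cameraMappingPoint.a1 + l1 * sinTheta,    // first-axis offset (assumed sign convention)
    b2: 0,                                        // tag lies on the AR ground plane
    c2: cameraMappingPoint.c1 + l1 * cosTheta,    // second-axis offset (assumed sign convention)
  };
}
```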
In a possible implementation manner, after step S140, when the working parameters of the target camera change, the monitored scene also changes. Correspondingly, in order to enable the AR scene to adaptively adjust the AR tags along with the changed monitored scene, referring further to fig. 7, the mobile tag display method provided in the embodiment of the present application may further include the following steps:
and S150, when the working parameters of the target camera are detected to be changed, adjusting the working state of the virtual camera according to the changed working parameters.
Step S160, correspondingly adjusting the display position of the AR tag of each mobile device 200 in the AR scene according to the adjusted working state of the virtual camera.
In this embodiment, the working parameters may be the gyroscope parameters, zoom parameters, rotation parameters, and the like listed above. When the working parameters of the target camera change, the virtual camera may be adjusted to synchronize with the working state of the target camera, and after synchronization is completed, the display position of the AR tag of each mobile device 200 may be adjusted in the AR scene. For example, when the monitored scene of the target camera has shifted after synchronization, the AR tags falling within the shifted monitored scene may be correspondingly added in the AR scene, and the AR tags no longer located in the AR scene may be removed. In this way, the AR tags can be continuously updated to reflect the monitoring changes of the mobile devices 200 whenever the working parameters of the target camera change.
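A brief sketch of how steps S150 and S160 might be wired together is shown below, reusing names from the earlier sketches (deviceTable, VirtualCameraController, CameraParams, MobileDeviceInfo); the rendering helpers declared here are hypothetical placeholders rather than an API defined by this application.

```typescript
// Hypothetical rendering helpers; real implementations would come from the
// WebGL scene and the coordinate computations sketched earlier.
declare function computeTagPosition(info: MobileDeviceInfo): { a2: number; b2: number; c2: number };
declare function isInsideView(pos: { a2: number; b2: number; c2: number }): boolean;
declare function drawTag(info: MobileDeviceInfo, pos: { a2: number; b2: number; c2: number }): void;
declare function removeTag(deviceId: string): void;
declare const controller: VirtualCameraController;

// Steps S150-S160: synchronize the virtual camera, then refresh every AR tag.
function onWorkingParamsChanged(params: CameraParams): void {
  controller.onTargetCameraChange(params);      // step S150: adjust the virtual camera
  for (const info of deviceTable.values()) {    // step S160: adjust tag display positions
    const pos = computeTagPosition(info);
    if (isInsideView(pos)) {
      drawTag(info, pos);                       // tags inside the shifted scene are (re)added
    } else {
      removeTag(info.deviceId);                 // tags no longer in the AR scene are removed
    }
  }
}
```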
In a possible implementation manner, after the foregoing step S140, when a new mobile device is added, in order to facilitate real-time display of the newly added mobile device in the AR scene, referring further to fig. 8, the mobile tag display method provided in this embodiment may further include the following steps:
step S170, when detecting that a mobile device is newly added to the monitoring scene of the target camera, adding the display position of the AR tag of the newly added mobile device to the AR scene.
In this embodiment, the AR mapping coordinates of the newly added mobile device in the AR scene may be determined according to the implementations of the foregoing steps S110 to S130, so that the AR tag of the newly added mobile device can be displayed in the AR scene, which facilitates displaying the newly added mobile device in the AR scene in real time.
Based on the same inventive concept, please refer to fig. 9, which shows a schematic diagram of functional modules of the mobile tag display apparatus 110 according to the embodiment of the present application, and the embodiment can divide the functional modules of the mobile tag display apparatus 110 according to the above method embodiment. For example, the functional blocks may be divided for the respective functions, or two or more functions may be integrated into one processing block. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation. For example, in the case of dividing each function module according to each function, the mobile tag display device 110 shown in fig. 9 is only a device diagram. The mobile tag display apparatus 110 may include an obtaining module 111, a constructing module 112, a determining module 113, and a tag display module 114, and the functions of the functional modules of the mobile tag display apparatus 110 are described in detail below.
An obtaining module 111, configured to obtain device information of each mobile device 200 accessed in the monitoring scenario, where the device information includes device location information. It is understood that the obtaining module 111 can be used to execute the step S110, and for the detailed implementation of the obtaining module 111, reference can be made to the contents related to the step S110.
And a constructing module 112, configured to construct an augmented reality AR scene of the target camera according to the selected position information of the target camera, and record a proportional relationship between the AR scene and the actual scene, where the AR scene includes a virtual camera corresponding to the target camera. It is understood that the building module 112 can be used to perform the step S120, and for the detailed implementation of the building module 112, reference can be made to the above description regarding the step S120.
A determining module 113, configured to determine, according to the device location information of each mobile device 200, the location information of the target camera, and the proportional relationship, AR mapping coordinates of each mobile device 200 in the AR scene. It is understood that the determining module 113 may be configured to perform the step S130, and for the detailed implementation of the determining module 113, reference may be made to the content related to the step S130.
And a tag display module 114, configured to display the AR tag of each mobile device 200 in the AR scene according to the AR mapping coordinates of each mobile device 200 in the AR scene. It is understood that the tag display module 114 can be used to execute the above step S140, and for the detailed implementation of the tag display module 114, reference can be made to the above description regarding the step S140.
In one possible implementation, the construction module 112 may construct the augmented reality AR scene of the target camera by:
according to the position information of the target camera, a three-dimensional simulation scene corresponding to the target camera is constructed according to a preset three-dimensional drawing protocol;
monitoring the working parameter change of the target camera, and controlling the virtual camera to execute the synchronous action corresponding to the target camera in the three-dimensional simulation scene according to the working parameter change so as to construct the augmented reality AR scene of the target camera.
In one possible implementation, the determination module 113 may determine the AR mapping coordinates of each mobile device 200 in the AR scene by:
for each mobile device 200, determining the distance relationship between the real coordinates of the mobile device 200 and the mapping coordinates of the target camera in the set plane coordinate system according to the device position information of the mobile device 200 and the position information of the target camera;
determining an angular relationship between real coordinates of the mobile device 200 and mapping coordinates of the target camera in a set plane coordinate system;
and determining the AR mapping coordinates of each mobile device 200 in the AR scene according to the distance relation, the angle relation and the proportional relation.
In one possible implementation, the real coordinates of the mobile device 200 are located in the set plane coordinate system, and the determining module 113 may determine the angular relationship between the real coordinates of the mobile device 200 and the mapped coordinates of the target camera in the set plane coordinate system by:
calculating the sine angle relationship between the real coordinate of the mobile device 200 and the mapping coordinate of the target camera in the direction of the first coordinate axis in the set plane coordinate system;
calculating the cosine angle relationship between the real coordinate of the mobile device 200 and the mapping coordinate of the target camera in the direction of a second coordinate axis in the set plane coordinate system;
the first coordinate axis and the second coordinate axis are perpendicular to each other, and the sine angle relationship and the cosine angle relationship form an angle relationship between a real coordinate of the mobile device 200 and a mapping coordinate of the target camera in a set plane coordinate system.
In one possible implementation, the determination module 113 may determine the AR mapping coordinates of each mobile device 200 in the AR scene by:
determining the AR distance relationship between the AR coordinates of the mobile device 200 in the AR scene and the AR coordinates of the virtual camera in the AR scene according to the distance relationship and the proportional relationship;
determining a first AR coordinate of the virtual camera in an AR scene, and determining an AR mapping point coordinate of the first AR coordinate in a set plane coordinate system;
determining a position relationship between the mobile device 200 and the target camera in the AR scene according to the AR mapping point coordinates, the angle relationship, and the AR distance relationship;
second AR coordinates of the mobile device 200 in the AR scene are calculated from a positional relationship between the mobile device 200 and the target camera in the AR scene to determine AR mapping coordinates of the mobile device 200 in the AR scene.
In a possible implementation, the mobile tag display apparatus 110 may further include an adjusting module, and the adjusting module may be configured to, when detecting that the operating parameter of the target camera changes, adjust the operating state of the virtual camera according to the changed operating parameter, and correspondingly adjust the display position of the AR tag of each mobile device 200 in the AR scene according to the adjusted operating state of the virtual camera.
In a possible implementation manner, the mobile tag display apparatus 110 may further include an adding module, and the adding module may be configured to, when a new mobile device is detected in the monitoring scene of the target camera, add a display position of an AR tag of the new mobile device in the AR scene.
Based on the same inventive concept, please refer to fig. 10, which shows a schematic block diagram of a terminal device 100 for executing the above mobile tag display method according to an embodiment of the present application, where the terminal device 100 may include a mobile tag display apparatus 110, a machine-readable storage medium 120, and a processor 130.
In this embodiment, the machine-readable storage medium 120 and the processor 130 are both located in the terminal device 100 and are separately located. However, it should be understood that the machine-readable storage medium 120 may be separate from the terminal device 100 and may be accessed by the processor 130 through a bus interface. Alternatively, the machine-readable storage medium 120 may be integrated into the processor 130, for example, the machine-readable storage medium 120 may also be a cache and/or general registers.
The processor 130 is a control center of the terminal device 100, connects various parts of the entire terminal device 100 using various interfaces and lines, performs various functions of the terminal device 100 and processes data by running or executing software programs and/or modules stored in the machine-readable storage medium 120 and calling data stored in the machine-readable storage medium 120, thereby performing overall control of the terminal device 100. Alternatively, processor 130 may include one or more processing cores; for example, the processor 130 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor.
The processor 130 may be a general-purpose Central Processing Unit (CPU), a microprocessor, an Application-Specific Integrated Circuit (ASIC), or one or more Integrated circuits for controlling the execution of the program of the mobile tag display method provided by the above-mentioned method embodiments.
The machine-readable storage medium 120 may be, but is not limited to, a ROM or other type of static storage device that can store static information and instructions, a RAM or other type of dynamic storage device that can store information and instructions, an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Compact Disc Read-Only Memory (CD-ROM) or other optical disk storage, optical disc storage (including compact disc, laser disc, optical disc, digital versatile disc, Blu-ray disc, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The machine-readable storage medium 120 may be self-contained and coupled to the processor 130 via a communication bus. The machine-readable storage medium 120 may also be integrated with the processor. The machine-readable storage medium 120 is used for storing machine-executable instructions for performing aspects of the present application. The processor 130 is configured to execute the machine-executable instructions stored in the machine-readable storage medium 120 to implement the mobile tag display method provided by the foregoing method embodiment.
The mobile tag display apparatus 110 may include, for example, the various functional modules (e.g., the obtaining module 111, the constructing module 112, the determining module 113, and the tag display module 114) described in fig. 9, and may be stored in the machine-readable storage medium 120 in the form of software program codes, and the processor 130 may implement the mobile tag display method provided by the foregoing method embodiment by executing the various functional modules of the mobile tag display apparatus 110.
Since the terminal device 100 provided in the embodiment of the present application is another implementation form of the method embodiment executed by the terminal device 100, and the terminal device 100 may be configured to execute the mobile tag display method provided in the method embodiment, the technical effect obtained by the terminal device may refer to the method embodiment, and is not described herein again.
Further, the present application also provides a readable storage medium containing computer executable instructions, and the computer executable instructions can be used for implementing the mobile tag display method provided by the above method embodiments when executed.
Of course, the storage medium provided in the embodiments of the present application contains computer-executable instructions, and the computer-executable instructions are not limited to the above method operations, and may also perform related operations in the mobile tag display method provided in any embodiments of the present application.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the present application has been described in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed application, from a review of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the word "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
The above description is only for various embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present application, and all such changes or substitutions are included in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A mobile tag display method is applied to terminal equipment, and the method comprises the following steps:
acquiring equipment information of each mobile equipment accessed in a monitoring scene, wherein the equipment information comprises equipment position information;
constructing an Augmented Reality (AR) scene of the target camera according to the selected position information of the target camera, and recording the proportional relation between the AR scene and an actual scene, wherein the AR scene comprises a virtual camera corresponding to the target camera;
determining an AR mapping coordinate of each mobile device in the AR scene according to the device position information of each mobile device, the position information of the target camera and the proportional relation;
and displaying the AR label of each mobile device in the AR scene according to the AR mapping coordinate of each mobile device in the AR scene.
2. The method according to claim 1, wherein the step of constructing the Augmented Reality (AR) scene of the target camera according to the selected position information of the target camera comprises:
according to the position information of the target camera, a three-dimensional simulation scene corresponding to the target camera is constructed according to a preset three-dimensional drawing protocol;
monitoring the working parameter change of the target camera, and controlling the virtual camera to execute the synchronous action corresponding to the target camera in the three-dimensional simulation scene according to the working parameter change so as to construct the Augmented Reality (AR) scene of the target camera.
3. The method according to claim 1, wherein the step of determining the AR mapping coordinates of each mobile device in the AR scene according to the device location information of each mobile device, the location information of the target camera, and the proportional relationship comprises:
for each mobile device, determining a distance relation between real coordinates of the mobile device and mapping coordinates of the target camera in a set plane coordinate system according to device position information of the mobile device and position information of the target camera;
determining an angle relation between real coordinates of the mobile equipment and mapping coordinates of the target camera in a set plane coordinate system;
and determining the AR mapping coordinate of each mobile device in the AR scene according to the distance relation, the angle relation and the proportional relation.
4. The mobile tag display method of claim 3, wherein the real coordinates of the mobile device are located in the set planar coordinate system, and the step of determining the angular relationship between the real coordinates of the mobile device and the mapping coordinates of the target camera in the set planar coordinate system comprises:
calculating the sine angle relation between the real coordinate of the mobile equipment and the mapping coordinate of the target camera in the direction of a first coordinate axis in the set plane coordinate system;
calculating the cosine angle relation between the real coordinate of the mobile equipment and the mapping coordinate of the target camera in the direction of a second coordinate axis in the set plane coordinate system;
the first coordinate axis and the second coordinate axis are perpendicular to each other, and the sine angle relationship and the cosine angle relationship form an angle relationship between a real coordinate of the mobile device and a mapping coordinate of the target camera in a set plane coordinate system.
5. The method according to claim 3, wherein the step of determining the AR mapping coordinates of each mobile device in the AR scene according to the distance relationship, the angle relationship and the proportional relationship comprises:
determining the AR distance relationship between the AR coordinates of the mobile equipment in the AR scene and the AR coordinates of the virtual camera in the AR scene according to the distance relationship and the proportional relationship;
determining a first AR coordinate of the virtual camera in the AR scene, and determining an AR mapping point coordinate of the first AR coordinate in the set plane coordinate system;
determining the position relation between the mobile equipment and the target camera in the AR scene according to the coordinate of the AR mapping point, the angle relation and the AR distance relation;
and calculating second AR coordinates of the mobile device in the AR scene according to the position relation between the mobile device and the target camera in the AR scene so as to determine the AR mapping coordinates of the mobile device in the AR scene.
6. The mobile tag display method according to any one of claims 1 to 5, further comprising:
when the working parameters of the target camera are detected to be changed, the working state of the virtual camera is adjusted according to the changed working parameters;
and correspondingly adjusting the display position of the AR label of each mobile device in the AR scene according to the adjusted working state of the virtual camera.
7. The mobile tag display method according to any one of claims 1 to 5, further comprising:
and when detecting that a mobile device is newly added in the monitoring scene of the target camera, adding the display position of the AR label of the newly added mobile device in the AR scene.
8. A mobile tag display device is applied to a terminal device, and the device comprises:
the system comprises an obtaining module, a monitoring module and a processing module, wherein the obtaining module is used for obtaining equipment information of each mobile equipment accessed in a monitoring scene, and the equipment information comprises equipment position information;
the construction module is used for constructing an Augmented Reality (AR) scene of the target camera according to the selected position information of the target camera and recording the proportional relation between the AR scene and an actual scene, wherein the AR scene comprises a virtual camera corresponding to the target camera;
the determining module is used for determining an AR mapping coordinate of each mobile device in the AR scene according to the device position information of each mobile device, the position information of the target camera and the proportional relation;
and the tag display module is used for displaying the AR tag of each mobile device in the AR scene according to the AR mapping coordinate of each mobile device in the AR scene.
9. A terminal device, comprising a machine-readable storage medium having stored thereon machine-executable instructions and a processor, wherein the processor, when executing the machine-executable instructions, implements the mobile tag display method of any one of claims 1 to 7.
10. A readable storage medium having stored therein machine executable instructions which when executed perform the mobile tag display method of any one of claims 1 to 7.
CN201911150100.1A 2019-11-21 2019-11-21 Mobile tag display method, device, terminal equipment and readable storage medium Active CN112825198B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911150100.1A CN112825198B (en) 2019-11-21 2019-11-21 Mobile tag display method, device, terminal equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911150100.1A CN112825198B (en) 2019-11-21 2019-11-21 Mobile tag display method, device, terminal equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN112825198A true CN112825198A (en) 2021-05-21
CN112825198B CN112825198B (en) 2024-04-05

Family

ID=75907335

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911150100.1A Active CN112825198B (en) 2019-11-21 2019-11-21 Mobile tag display method, device, terminal equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN112825198B (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140306996A1 (en) * 2013-04-15 2014-10-16 Tencent Technology (Shenzhen) Company Limited Method, device and storage medium for implementing augmented reality
CN109782901A (en) * 2018-12-06 2019-05-21 网易(杭州)网络有限公司 Augmented reality exchange method, device, computer equipment and storage medium
CN109961522A (en) * 2019-04-02 2019-07-02 百度国际科技(深圳)有限公司 Image projecting method, device, equipment and storage medium
CN110097061A (en) * 2019-04-16 2019-08-06 聚好看科技股份有限公司 A kind of image display method and apparatus

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114116110A (en) * 2021-07-20 2022-03-01 上海诺司纬光电仪器有限公司 Intelligent interface based on augmented reality

Also Published As

Publication number Publication date
CN112825198B (en) 2024-04-05


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant