CN112445995A - Scene fusion display method and device under WebGL - Google Patents

Scene fusion display method and device under WebGL

Info

Publication number
CN112445995A
Authority
CN
China
Prior art keywords
scene
webgl
dimensional map
dimensional
webpage
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011378932.1A
Other languages
Chinese (zh)
Other versions
CN112445995B (en)
Inventor
桑新柱
邢树军
郑玮泽
张泷
沈圣
刘昊
胡松磊
刘彤彤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing University of Posts and Telecommunications
Original Assignee
Beijing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing University of Posts and Telecommunications
Priority to CN202011378932.1A
Publication of CN112445995A
Application granted
Publication of CN112445995B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/958 Organisation or management of web site content, e.g. publishing, maintaining pages or automatic linking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/957 Browsing optimisation, e.g. caching or content distillation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 Geographic models
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Remote Sensing (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Instructional Devices (AREA)

Abstract

The invention provides a scene fusion display method and device under WebGL. The method comprises the following steps: constructing a small scene in Unity and fusing monitoring video into it; exporting a WebGL file containing the monitoring video information from Unity and embedding it into a webpage; acquiring two-dimensional map information around the small scene and embedding it into the webpage; and establishing communication with the small scene, so that the small scene, the two-dimensional map and the three-dimensional earth are displayed in association in the webpage. Besides fusing real-time monitoring video into the macroscopic three-dimensional earth scene, the method also fuses the video into a small scene and combines a two-dimensional map, so that the environment can be understood more comprehensively. Through the linkage of the two three-dimensional maps, an effective and timely response to an emergency in the monitored area can be planned according to the two-dimensional map.

Description

Scene fusion display method and device under WebGL
Technical Field
The invention relates to the technical field of image information processing, in particular to a scene fusion display method and device under WebGL.
Background
In recent years, with the rapid development of the Internet of Things and of sensors, the concept of the smart city has become increasingly clear. A traditional standard GIS application usually has no timeline and essentially presents the information of a single time node, which severely limits the real-time interactivity of three-dimensional scenes. The advent of WebGL (Web Graphics Library), a 3D graphics protocol, solves this problem: on the basis of the conventional static map, WebGL improves the timeliness of the data and the realism of the scene. With the continuous spread of video monitoring network systems, such systems will be built ever more quickly, and the camera market shows huge room for growth.
However, because of factors such as the height, angle and resolution of each camera, the visible area of a single camera is still very limited. Although the visible area can be enlarged by adding cameras, differing scenes, irregular camera distribution and cluttered information make the monitored target difficult to observe, which greatly harms monitoring efficiency and the viewing experience and makes it hard to form a macroscopic picture of the target's trajectory and surroundings. Fusing the monitoring video with both a macroscopic scene and a small scene makes it possible to clearly observe the movement of the monitored target and the surroundings of the observed person, to judge the position of the monitored object from the small scene, and to locate an emergency quickly from the macroscopic scene, so that the most effective and appropriate response can be chosen. A multi-scene monitoring environment is therefore particularly important in GIS applications.
In existing research on fusing WebGL-based photorealistic three-dimensional scenes with video monitoring images, the RTSP stream transmitted by the camera (a universal network streaming-media protocol) is first transcoded by the open-source player VLC into OGG-format data that can be played directly, and an Nginx proxy then makes the processed data playable directly in HTML. Combined with the development interface of the WebGL three-dimensional engine Cesium, the real-time monitoring video can then be fused into a three-dimensional scene.
The current technology only fuses the real-time monitoring video into the macroscopic three-dimensional earth scene, so the surroundings of the video cannot be judged accurately, and the information in the existing three-dimensional earth scene is not comprehensive enough to handle sudden emergencies.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a scene fusion display method and device under WebGL.
The invention provides a scene fusion display method under WebGL, which comprises the following steps: constructing a small scene in Unity, and fusing monitoring video into it; exporting a WebGL file containing the monitoring video information from Unity, embedding the WebGL file into a webpage, acquiring two-dimensional map information around the small scene, and embedding the two-dimensional map information into the webpage; and establishing communication with the small scene, and displaying the small scene, the two-dimensional map and the three-dimensional earth in association in the webpage.
According to the scene fusion display method under WebGL of one embodiment of the invention, the establishing of communication with the small scene comprises: communicating with the small scene in Unity through the SendMessage method.
According to the scene fusion display method under WebGL, after two-dimensional map information around a small scene is obtained, an automatic roaming rule is established according to a two-dimensional map; correspondingly, after the associated display is performed in the webpage, the method further comprises the following steps: and displaying the scene according to the roaming rule.
According to the scene fusion display method under WebGL, after the association display is carried out in the webpage, the method further comprises the following steps: and receiving a manual roaming rule planned according to the two-dimensional map, and displaying the scene according to the manual roaming rule.
According to the scene fusion display method under WebGL, the method for performing association display on a small scene, a two-dimensional map and a three-dimensional earth in a webpage comprises the following steps: determining the longitude and latitude of a central point according to the longitude and latitude of the three-dimensional earth visible area; and synchronizing the longitude and latitude of the central point to the longitude and latitude of the central point of the two-dimensional map.
According to the scene fusion display method under WebGL of one embodiment of the invention, the synchronizing of the longitude and latitude of the central point to the longitude and latitude of the central point of the two-dimensional map comprises: synchronizing the longitude and latitude of the central point to the central point of the two-dimensional map through the setCenter method of the two-dimensional map view.
According to the scene fusion display method under WebGL, the associated display of the small scene, the two-dimensional map and the three-dimensional earth in the webpage comprises: determining a maximum longitude and latitude point and a minimum longitude and latitude point of the visible area in the three-dimensional earth scene; obtaining the distance between the two points, comparing the obtained distance with the scale distance of each zoom level, and when the scale distance of a zoom level is larger than the distance between the two points, taking that zoom level as the zoom level of the two-dimensional map.
The invention also provides a scene fusion display device under WebGL, which comprises: a fusion module, used for constructing a small scene in Unity and fusing monitoring video into it; an embedding module, used for exporting a WebGL file containing the monitoring video information from Unity, embedding the WebGL file into a webpage, acquiring two-dimensional map information around the small scene, and embedding the two-dimensional map information into the webpage; and a display module, used for establishing communication with the small scene and displaying the small scene, the two-dimensional map and the three-dimensional earth in association in the webpage.
The invention also provides an electronic device, which comprises a memory, a processor and a computer program which is stored on the memory and can run on the processor, wherein the processor executes the program to realize the steps of the scene fusion display method under the WebGL.
The present invention also provides a non-transitory computer readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the steps of the scene fusion display method under WebGL as described in any one of the above.
Besides fusing real-time monitoring video into the macroscopic three-dimensional earth scene, the scene fusion display method and device under WebGL provided by the invention also fuse the video into a small scene and combine a two-dimensional map, so that the environment can be understood more comprehensively. Through the linkage of the two three-dimensional maps, an effective and timely response to an emergency in the monitored area can be planned according to the two-dimensional map.
Drawings
In order to more clearly illustrate the technical solutions of the present invention or the prior art, the drawings needed for the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a schematic flow chart of a scene fusion display method under WebGL provided by the present invention;
FIG. 2 is a flow chart of linkage of visual areas of a scene fusion display method under WebGL provided by the present invention;
FIG. 3 is a schematic structural diagram of a scene fusion display device under WebGL provided by the present invention;
fig. 4 is a schematic structural diagram of an electronic device provided in the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The following describes a scene fusion display method and device under WebGL in the present invention with reference to fig. 1 to 4. Fig. 1 is a schematic flow diagram of a scene fusion display method under WebGL provided by the present invention, and as shown in fig. 1, the scene fusion display method under WebGL provided by the present invention includes:
101. and constructing a small scene in Unity, and fusing the monitoring videos.
WebGL is a 3D drawing protocol that lets Web developers display 3D scenes and models smoothly in the browser with the help of the system graphics card. The invention adds a small-scene component under WebGL, so that the environment can be understood more comprehensively. After the small scene is constructed in Unity, the monitoring videos in the small scene are fused into it. A small scene may be, for example, the scene that a single monitoring camera can capture.
102. And exporting a WebGL file containing monitoring video information through Unity, embedding the WebGL file into a webpage, acquiring two-dimensional map information around a small scene, and embedding the two-dimensional map information into the webpage.
The constructed scene is exported from Unity as a WebGL file in which the video and the position information of the scene are fused, and this WebGL file is embedded into the webpage. Meanwhile, a two-dimensional map around the small scene is acquired and embedded into the webpage for subsequent display.
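The export-and-embed step can be sketched as follows, assuming Unity's standard WebGL loader API (createUnityInstance) and placeholder build and product names; the actual file names are generated by Unity from the project's build settings, not stated in the patent.

```javascript
// Build the loader configuration for a Unity WebGL export.
// "scene" and the company/product names below are placeholders.
function unityLoaderConfig(buildName) {
  return {
    dataUrl: `Build/${buildName}.data`,
    frameworkUrl: `Build/${buildName}.framework.js`,
    codeUrl: `Build/${buildName}.wasm`,
    streamingAssetsUrl: "StreamingAssets",
    companyName: "Example",
    productName: "SceneFusion",
  };
}

// In the page, the Unity loader script defines createUnityInstance:
//   <script src="Build/scene.loader.js"></script>
//   createUnityInstance(document.querySelector("#unity-canvas"),
//                       unityLoaderConfig("scene"))
//     .then((instance) => { window.unityInstance = instance; });
```

The config object is plain data, so the same sketch works for any build name; only the loader call itself requires the browser.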
103. And establishing communication with the small scene, and performing related display on the small scene, the two-dimensional map and the three-dimensional earth in a webpage.
By communicating with Unity from the HTML side, methods written in Unity can be called at the HTML end. The small scene, the two-dimensional map and the three-dimensional earth are displayed in association in the webpage, which makes the information around the monitored environment more comprehensive; functions such as route planning can be realized on the two-dimensional map, helping the user devise a faster and more effective response to a monitored emergency.
Besides fusing real-time monitoring video into the macroscopic three-dimensional earth scene, the scene fusion display method under WebGL also fuses the video into a small scene and combines a two-dimensional map, so that the environment can be understood more comprehensively. Through the linkage of the two three-dimensional maps, an effective and timely response to an emergency in the monitored area can be planned according to the two-dimensional map.
In one embodiment, the establishing of communication with the small scene includes: communicating with the small scene in Unity through the SendMessage method.
In HTML, the SendMessage method can be used to communicate with Unity and to call methods written in Unity at the HTML end.
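The SendMessage bridge can be sketched as below; the Unity WebGL instance exposes SendMessage(gameObjectName, methodName, value), while the game-object and method names used here are illustrative placeholders, not names from the patent.

```javascript
// Call a method defined in a Unity script from the HTML side.
// Guards against the WebGL build not having finished loading yet.
function notifyUnity(instance, gameObject, method, value) {
  if (!instance || typeof instance.SendMessage !== "function") {
    throw new Error("Unity instance is not ready");
  }
  instance.SendMessage(gameObject, method, value);
}

// Usage once the build has loaded, e.g. to start a roaming path
// (object and method names are hypothetical):
//   notifyUnity(window.unityInstance, "SceneRoot", "StartRoaming", "path_01");
```

Wrapping the call this way gives the page one place to handle the "build still loading" case instead of sprinkling readiness checks everywhere.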
In one embodiment, after the two-dimensional map information around the small scene is acquired, an automatic roaming rule is established according to the two-dimensional map; correspondingly, after the associated display is performed in the webpage, the method further comprises the following steps: and displaying the scene according to the roaming rule.
For example, the HTML end can communicate with the small scene in Unity and then perform roaming display along the automatic roaming path planned in advance on the two-dimensional map.
In one embodiment, after the associated presentation in the webpage, the method further comprises: and receiving a manual roaming rule planned according to the two-dimensional map, and displaying the scene according to the manual roaming rule. The path can be manually planned according to the two-dimensional map for displaying.
In one embodiment, the associating and showing the small scene, the two-dimensional map and the three-dimensional earth in the webpage comprises: determining the longitude and latitude of a central point according to the longitude and latitude of the three-dimensional earth visible area; and synchronizing the longitude and latitude of the central point to the longitude and latitude of the central point of the two-dimensional map.
Specifically, the synchronizing of the longitude and latitude of the central point to the longitude and latitude of the central point of the two-dimensional map includes: synchronizing the longitude and latitude of the central point to the central point of the two-dimensional map through the setCenter method of the two-dimensional map view.
After communication with the small scene is established, the two-dimensional map and the three-dimensional earth are further synchronized to realize central-point linkage. The longitude and latitude of the boundary of the visible area can be obtained easily by calling the corresponding method in Cesium, from which the longitude and latitude of the central point are determined; the obtained central point is then passed to the two-dimensional map (OpenLayers) through the view's setCenter method, so that the central points of the two maps remain consistent.
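The centre-point linkage can be sketched as follows, assuming the usual Cesium (camera.computeViewRectangle) and OpenLayers (view.setCenter) APIs; the midpoint arithmetic is a simplification that ignores antimeridian wrap-around.

```javascript
// Centre of a visible-area rectangle given as west/south/east/north degrees.
function computeCenter(rect) {
  return {
    lon: (rect.west + rect.east) / 2,
    lat: (rect.south + rect.north) / 2,
  };
}

// With the real libraries (calls shown as comments, since they need
// a browser plus the Cesium and OpenLayers globals):
//   const r = viewer.camera.computeViewRectangle();   // Cesium, radians
//   const c = computeCenter({
//     west: Cesium.Math.toDegrees(r.west),  east: Cesium.Math.toDegrees(r.east),
//     south: Cesium.Math.toDegrees(r.south), north: Cesium.Math.toDegrees(r.north),
//   });
//   map.getView().setCenter(ol.proj.fromLonLat([c.lon, c.lat]));  // OpenLayers
```

Keeping computeCenter as plain arithmetic makes the linkage logic testable without either map library loaded.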
In one embodiment, the associated display of the small scene, the two-dimensional map and the three-dimensional earth in the webpage comprises the following steps: determining a maximum longitude and latitude point and a minimum longitude and latitude point of the visible area in the three-dimensional earth scene; obtaining the distance between the two points, comparing the obtained distance with the scale distance of each zoom level, and when the scale distance of a zoom level is larger than the distance between the two points, taking that zoom level as the zoom level of the two-dimensional map.
To realize linkage of the visible areas, the camera height in the three-dimensional earth scene must be kept consistent with the zoom level of the two-dimensional map, so that the two three-dimensional maps cover the same visible range. First, the upper-right corner (the maximum longitude and latitude point) and the lower-left corner (the minimum longitude and latitude point) of the visible area of the three-dimensional earth scene are determined, and the distance between the two points is obtained through the wgs84Sphere distance method; this distance is then compared with the scale distance of each zoom level, and the matching zoom level is applied to the two-dimensional map. Fig. 2 is a flow chart of the visible-area linkage of the scene fusion display method under WebGL of the present invention. Tests show that this correspondence achieves a good linkage effect during zooming; the specific correspondence is shown in Table 1.
TABLE 1
Zoom level    Scale distance
1             140
2             270
3             500
4             670
5             850
6             1500
7             3350
8             4500
9             10970
10            33030
11            155280
12            193500
13            280580
14            437200
15            1320000
16            1777000
17            4300000
18            5368000
JavaScript serves as the bridge for communication between the two three-dimensional maps: according to the distance between the maximum and minimum longitude and latitude points of the Cesium visible area, the corresponding OpenLayers zoom level is looked up, thereby achieving the linkage effect.
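The zoom-level matching can be sketched with the values of Table 1. The units of the scale distances are not stated in the description; metres are assumed here, and the haversine measurement below matches that assumption. The "first level whose scale distance exceeds the measured distance" rule follows the wording of the description.

```javascript
// Scale-distance table from Table 1: [zoom level, scale distance].
const SCALE_TABLE = [
  [1, 140], [2, 270], [3, 500], [4, 670], [5, 850], [6, 1500],
  [7, 3350], [8, 4500], [9, 10970], [10, 33030], [11, 155280],
  [12, 193500], [13, 280580], [14, 437200], [15, 1320000],
  [16, 1777000], [17, 4300000], [18, 5368000],
];

// Great-circle distance in metres between two [lon, lat] points in degrees.
function haversine([lon1, lat1], [lon2, lat2]) {
  const R = 6371000; // mean Earth radius in metres
  const toRad = (d) => (d * Math.PI) / 180;
  const a =
    Math.sin(toRad(lat2 - lat1) / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) *
      Math.sin(toRad(lon2 - lon1) / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// First zoom level whose scale distance is larger than the measured
// distance, as the description specifies; falls back to the last level.
function zoomForDistance(metres) {
  for (const [zoom, scale] of SCALE_TABLE) {
    if (scale > metres) return zoom;
  }
  return SCALE_TABLE[SCALE_TABLE.length - 1][0];
}

// Usage: measure the diagonal of the visible area, then set the zoom:
//   const d = haversine(minLonLat, maxLonLat);
//   map.getView().setZoom(zoomForDistance(d));
```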
The scene fusion display device under the WebGL provided by the present invention is described below, and the scene fusion display device under the WebGL described below and the scene fusion display method under the WebGL described above can be referred to each other correspondingly.
Fig. 3 is a schematic structural diagram of a scene fusion display device under WebGL provided in the present invention. As shown in Fig. 3, the scene fusion display device under WebGL includes: a fusion module 301, an embedding module 302, and a display module 303. The fusion module 301 is used for constructing a small scene in Unity and fusing monitoring video into it; the embedding module 302 is configured to export a WebGL file containing the monitoring video information from Unity, embed the WebGL file into a webpage, acquire two-dimensional map information around the small scene, and embed the two-dimensional map information into the webpage; the display module 303 is configured to establish communication with the small scene and display the small scene, the two-dimensional map and the three-dimensional earth in association in the webpage.
The device embodiment provided in the embodiments of the present invention implements the above method embodiments; for details of the process, reference is made to the method embodiments above, which are not repeated here.
Besides fusing real-time monitoring video into the macroscopic three-dimensional earth scene, the scene fusion display device under WebGL provided by the embodiment of the invention also fuses the video into a small scene and combines a two-dimensional map, so that the environment can be understood more comprehensively. Through the linkage of the two three-dimensional maps, an effective and timely response to an emergency in the monitored area can be planned according to the two-dimensional map.
Fig. 4 is a schematic structural diagram of an electronic device provided in the present invention. As shown in Fig. 4, the electronic device may include: a processor 401, a communication interface 402, a memory 403 and a communication bus 404, wherein the processor 401, the communication interface 402 and the memory 403 communicate with each other through the communication bus 404. The processor 401 may call logic instructions in the memory 403 to execute the scene fusion display method under WebGL, the method comprising: constructing a small scene in Unity, and fusing monitoring video into it; exporting a WebGL file containing the monitoring video information from Unity, embedding the WebGL file into a webpage, acquiring two-dimensional map information around the small scene, and embedding the two-dimensional map information into the webpage; and establishing communication with the small scene, and displaying the small scene, the two-dimensional map and the three-dimensional earth in association in the webpage.
In addition, the logic instructions in the memory 403 may be implemented in the form of software functional units and stored in a computer readable storage medium when the software functional units are sold or used as independent products. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In another aspect, the present invention further provides a computer program product, the computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to execute the scene fusion display method under WebGL provided by the above methods, the method comprising: constructing a small scene in Unity, and fusing monitoring video into it; exporting a WebGL file containing the monitoring video information from Unity, embedding the WebGL file into a webpage, acquiring two-dimensional map information around the small scene, and embedding the two-dimensional map information into the webpage; and establishing communication with the small scene, and displaying the small scene, the two-dimensional map and the three-dimensional earth in association in the webpage.
In still another aspect, the present invention further provides a non-transitory computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, implementing the scene fusion display method under WebGL provided in the foregoing embodiments, the method comprising: constructing a small scene in Unity, and fusing monitoring video into it; exporting a WebGL file containing the monitoring video information from Unity, embedding the WebGL file into a webpage, acquiring two-dimensional map information around the small scene, and embedding the two-dimensional map information into the webpage; and establishing communication with the small scene, and displaying the small scene, the two-dimensional map and the three-dimensional earth in association in the webpage.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A scene fusion display method under WebGL is characterized by comprising the following steps:
constructing a small scene in Unity, and fusing monitoring videos;
exporting a WebGL file containing monitoring video information from Unity, embedding the WebGL file into a webpage, acquiring two-dimensional map information around the small scene, and embedding the two-dimensional map information into the webpage;
and establishing communication with the small scene, and performing related display on the small scene, the two-dimensional map and the three-dimensional earth in a webpage.
2. The method for scene fusion presentation under WebGL of claim 1, wherein the establishing communication with a small scene comprises:
and communicating with the small and medium-sized Unity scenes by a SendMessage method.
3. The method for fusion display of scenes under WebGL of claim 1, wherein after acquiring two-dimensional map information around a small scene, further comprising establishing an automatic roaming rule according to the two-dimensional map;
correspondingly, after the associated display is performed in the webpage, the method further comprises the following steps:
and displaying the scene according to the roaming rule.
4. The method for scene fusion presentation under WebGL of claim 1, wherein after performing association presentation in a webpage, the method further comprises:
and receiving a manual roaming rule planned according to the two-dimensional map, and displaying the scene according to the manual roaming rule.
5. The method for fusion display of scenes under WebGL of claim 1, wherein the association display of small scenes, two-dimensional maps and three-dimensional earth in a webpage comprises:
determining the longitude and latitude of a central point according to the longitude and latitude of the three-dimensional earth visible area;
and synchronizing the longitude and latitude of the central point to the longitude and latitude of the central point of the two-dimensional map.
6. The scene fusion display method under WebGL of claim 5, wherein the synchronizing the longitude and latitude of the center point to the center point of the two-dimensional map comprises:
synchronizing the longitude and latitude of the center point to the center point of the two-dimensional map through a view method.
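Claims 5 and 6 can be sketched as a pure center-point computation whose result is pushed into a setCenter-style view method. The extent and view shapes below are assumptions; real globe and map libraries (e.g. Cesium for the three-dimensional earth, OpenLayers for the two-dimensional map) expose equivalents:

```javascript
// Compute the lon/lat center of the 3D globe's visible area.
// extent: { west, south, east, north } in degrees (assumed shape).
function visibleAreaCenter(extent) {
  let { west, south, east, north } = extent;
  if (east < west) east += 360; // view straddles the antimeridian
  let lon = (west + east) / 2;
  if (lon > 180) lon -= 360;
  return { lon, lat: (south + north) / 2 };
}

// Push the computed center into the 2D map's view. The setCenter call is
// an OpenLayers-style API (assumption); a real map may also need the
// coordinates reprojected into its view projection.
function syncMapCenter(view, extent) {
  const { lon, lat } = visibleAreaCenter(extent);
  view.setCenter([lon, lat]);
}
```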
7. The scene fusion display method under WebGL of claim 1, wherein the displaying the small scene, the two-dimensional map and the three-dimensional earth in association in the webpage comprises:
determining a maximum longitude-latitude point and a minimum longitude-latitude point of the visible area in the three-dimensional earth scene;
obtaining the distance between the two points, comparing the obtained distance with the distance represented by each scale level, and, when the distance of a scale level is larger than the distance between the two points, determining that level as the zoom level of the two-dimensional map.
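Claim 7's zoom selection can be sketched as a great-circle (haversine) distance between the two extreme points of the visible area, compared against a per-level scale table. The table values in the test are illustrative, not from the patent; a real map library would supply its own resolution table:

```javascript
const EARTH_RADIUS_M = 6371000;

// Great-circle (haversine) distance in metres between two lon/lat points.
function haversine(a, b) {
  const rad = (d) => (d * Math.PI) / 180;
  const dLat = rad(b.lat - a.lat);
  const dLon = rad(b.lon - a.lon);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(rad(a.lat)) * Math.cos(rad(b.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(h));
}

// scaleTable: [level, metres covered at that level], sorted from
// zoomed-out (large distance) to zoomed-in (small distance). Returns the
// deepest level whose scale distance still exceeds the visible span.
function pickZoom(minPoint, maxPoint, scaleTable) {
  const span = haversine(minPoint, maxPoint);
  let zoom = 0;
  for (const [level, meters] of scaleTable) {
    if (meters > span) zoom = level;
    else break;
  }
  return zoom;
}
```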
8. A scene fusion display device under WebGL, comprising:
a fusion module, configured to construct a small scene in Unity and perform monitoring video fusion;
an embedding module, configured to export a WebGL file containing the monitoring video information through Unity, embed the WebGL file into a webpage, acquire two-dimensional map information around the small scene, and embed the two-dimensional map information into the webpage;
a display module, configured to establish communication with the small scene and display the small scene, the two-dimensional map and the three-dimensional earth in association in the webpage.
9. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the scene fusion display method under WebGL of any one of claims 1 to 7.
10. A non-transitory computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the scene fusion display method under WebGL of any one of claims 1 to 7.
CN202011378932.1A 2020-11-30 2020-11-30 Scene fusion display method and device under WebGL Active CN112445995B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011378932.1A CN112445995B (en) 2020-11-30 2020-11-30 Scene fusion display method and device under WebGL

Publications (2)

Publication Number Publication Date
CN112445995A true CN112445995A (en) 2021-03-05
CN112445995B CN112445995B (en) 2024-02-13

Family

ID=74739060

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011378932.1A Active CN112445995B (en) 2020-11-30 2020-11-30 Scene fusion display method and device under WebGL

Country Status (1)

Country Link
CN (1) CN112445995B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018121699A1 (en) * 2016-12-29 2018-07-05 中兴通讯股份有限公司 Video communication method, device and terminal
CN111104622A (en) * 2019-11-29 2020-05-05 武汉虹信技术服务有限责任公司 WEBGL-based three-dimensional GIS intelligent monitoring method and device
CN111274337A (en) * 2019-12-31 2020-06-12 北方信息控制研究院集团有限公司 Two-dimensional and three-dimensional integrated GIS system based on live-action three-dimension
CN111815787A (en) * 2020-07-13 2020-10-23 北京优锘科技有限公司 Three-dimensional digital plan making system and method for petrochemical enterprises
CN111836012A (en) * 2020-06-28 2020-10-27 航天图景(北京)科技有限公司 Video fusion and video linkage method based on three-dimensional scene and electronic equipment

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114928718A (en) * 2022-04-29 2022-08-19 厦门图扑软件科技有限公司 Video monitoring method and device, electronic equipment and storage medium
CN116309940A (en) * 2023-03-22 2023-06-23 浪潮智慧科技有限公司 Map information display method, equipment and medium based on animation popup window assembly
CN116309940B (en) * 2023-03-22 2023-11-24 浪潮智慧科技有限公司 Map information display method, equipment and medium based on animation popup window assembly
CN117119148A (en) * 2023-08-14 2023-11-24 中南民族大学 Visual evaluation method and system for video monitoring effect based on three-dimensional scene
CN117119148B (en) * 2023-08-14 2024-02-02 中南民族大学 Visual evaluation method and system for video monitoring effect based on three-dimensional scene

Similar Documents

Publication Publication Date Title
CN112445995B (en) Scene fusion display method and device under WebGL
US20200090303A1 (en) Method and device for fusing panoramic video images
Kikuchi et al. Future landscape visualization using a city digital twin: Integration of augmented reality and drones with implementation of 3D model-based occlusion handling
CN108961417B (en) Method and device for automatically generating space size in three-dimensional house model
CN111031293B (en) Panoramic monitoring display method, device and system and computer readable storage medium
CN110889824A (en) Sample generation method and device, electronic equipment and computer readable storage medium
CN107084740B (en) Navigation method and device
US20190378289A1 (en) Determining Size Of Virtual Object
CN112508071B (en) BIM-based bridge disease marking method and device
CN108961423B (en) Virtual information processing method, device, equipment and storage medium
CN110288692B (en) Illumination rendering method and device, storage medium and electronic device
CN115546377B (en) Video fusion method and device, electronic equipment and storage medium
CN113516666A (en) Image cropping method and device, computer equipment and storage medium
CN114928718A (en) Video monitoring method and device, electronic equipment and storage medium
CN113298130B (en) Method for detecting target image and generating target object detection model
CN109491565B (en) Method and equipment for displaying component information of object in three-dimensional scene
US20180121729A1 (en) Segmentation-based display highlighting subject of interest
TWI705692B (en) Information sharing method and device in three-dimensional scene model
CN113989442B (en) Building information model construction method and related device
CN110381353A (en) Video scaling method, apparatus, server-side, client and storage medium
CN111787081B (en) Information processing method based on Internet of things interaction and intelligent communication and cloud computing platform
CN114417452A (en) Method for processing building information model and related device
CN112825198B (en) Mobile tag display method, device, terminal equipment and readable storage medium
CN112465987A (en) Navigation map construction method for three-dimensional reconstruction of visual fusion information
CN114061593A (en) Navigation method based on building information model and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant