CN111083445A - Campus three-dimensional prevention and control system

Campus three-dimensional prevention and control system

Info

Publication number
CN111083445A
Authority
CN
China
Prior art keywords
access server
augmented reality
data
video
code stream
Prior art date
Legal status
Pending
Application number
CN201911375473.9A
Other languages
Chinese (zh)
Inventor
朱建永
王志敏
于静伟
张晓东
宋伟
Current Assignee
Shenzhen Julong Educational Technology Network Co ltd
Original Assignee
Shenzhen Julong Educational Technology Network Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Julong Educational Technology Network Co ltd filed Critical Shenzhen Julong Educational Technology Network Co ltd
Priority to CN201911375473.9A priority Critical patent/CN111083445A/en
Publication of CN111083445A publication Critical patent/CN111083445A/en
Pending legal-status Critical Current

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181: Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21: Server components or server architectures
    • H04N21/218: Source of audio or video content, e.g. local disk arrays
    • H04N21/21805: Source of audio or video content enabling multiple viewpoints, e.g. using a plurality of cameras
    • H04N21/2187: Live feed
    • H04N21/23: Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236: Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/23614: Multiplexing of additional data and video streams

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention discloses a campus three-dimensional prevention and control system comprising a configuration management client, a video access server, a data access server, an augmented reality client and a plurality of video devices, the configuration management client being connected with the video access server. The configuration management client configures and stores the parameters of the video devices, the video access server and the data access server. The video devices collect real-time code streams. The video access server is connected with the video devices and distributes the code streams to the augmented reality client. The data access server accesses third-party information systems and receives the data they send. The augmented reality client is connected with the video access server and the data access server respectively. With the addition of augmented reality technology, the data from the video devices and from the third-party information systems can be displayed comprehensively, dynamically and in real time on a large-screen command and dispatch desk, forming a three-dimensional prevention and control system.

Description

Campus three-dimensional prevention and control system
Technical Field
The invention relates to the field of campus security, in particular to a campus three-dimensional prevention and control system.
Background
Human society entered the information age in the 21st century, and with the rapid development of multimedia and network technology, people pursue a safer and more comfortable way of life. Emerging digital and intelligent application systems are an inevitable product of the information era and a combination of high technology with the development of the times: taking a local building as the platform, they integrate intelligent equipment, automation and network systems into an optimized combination of security, service and management, providing a safe, efficient, comfortable and convenient living environment. The concept of digital and intelligent application systems is permeating many fields. The digital campus monitoring system is an extension of this concept: it serves teachers and students and aims to provide students with a comfortable and safe learning environment. Under the influence of the global digital wave, the construction of digital safe campuses has received wide attention, and schools across the country, supported by current technology, are engaging with digital campus construction from all aspects. In recent years, thefts, robberies, frauds, kidnappings and various violent incidents have occurred repeatedly at schools in many regions, and a number of vicious criminal incidents targeting primary and secondary schools and kindergartens in China have been particularly alarming. How to strengthen campus security management has therefore become a key concern for school administrators and public security organs, and building an advanced, effective and highly integrated video monitoring system within the school has been put on the agenda. Digital video monitoring systems, which integrate security prevention, anti-theft alarm, examination room monitoring, courseware production, comprehensive management and other functions, have been popularized in schools across the country. Such a system not only protects the personal safety of teachers and students and reduces the occurrence of public security incidents, but can also be a powerful tool for school managers at all levels to manage classrooms and students effectively. Accordingly, a campus three-dimensional prevention and control system is provided.
Disclosure of Invention
The invention aims to provide managers with a modern, networked video monitoring solution through a campus three-dimensional prevention and control system: to realize advanced remote networked video monitoring; to provide a multi-level, large-scale, complete and reliable security management function; to monitor security operations effectively while capturing the course of sudden incidents in an area on camera, providing effective image evidence for accident prevention, security management and unified command; and to enable management departments at all levels to grasp the real-time security situation at any time so as to make correct decisions.
In order to solve the above problems, the invention provides a three-dimensional prevention and control system comprising a configuration management client, a video access server, a data access server, an augmented reality client and a plurality of video devices, the configuration management client being connected with the video access server;
the configuration management client is used for configuring and storing parameters of the video devices, the video access server and the data access server;
the video devices are used for collecting real-time code streams, wherein some of the video devices are also used for adding augmented reality information to the real-time code stream to form an augmented reality real-time code stream;
the video access server is connected with the video devices and is used for distributing the code streams to the augmented reality client;
the data access server is used for accessing a third-party information system and receiving data sent by the third-party information system;
the augmented reality client is connected with the video access server and the data access server respectively, and is used for integrating and presenting the code stream sent by the video access server and the third-party information system data sent by the data access server.
Preferably, the video access server accesses the video devices through the 28281 protocol or through an SDK.
Preferably, the data access server provides an active access data service and a passive access data service, and the third-party information system is accessed through the active access data service or the passive access data service.
Preferably, the active access data service connects to the third-party information system through an SDK, an API interface or a database provided by the third-party information system; the passive access data service provides an HTTP API interface through which the third-party information system actively sends data to the data access server.
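By way of illustration only (not part of the original disclosure), the sketch below shows a minimal passive access data service: an HTTP endpoint to which a third-party information system could actively push data. The endpoint path and payload field names are assumptions chosen for the example.

```python
# Minimal sketch of a passive access data service; path and fields are assumed.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

RECEIVED = []  # stands in for the data access server's internal store/queue


class PushHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/api/v1/push":           # hypothetical endpoint path
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        RECEIVED.append(payload)                  # e.g. {"source": "face", "plate": "..."}
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b'{"status":"ok"}')


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), PushHandler).serve_forever()
```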
Preferably, the video devices include a common camera for collecting a real-time code stream and an augmented reality camera for collecting a real-time code stream and adding augmented reality information to it to form an augmented reality real-time code stream.
Preferably, the augmented reality information includes one or more of a tag, longitude and latitude, altitude, and a satellite clock.
Preferably, the tag is configured with the parameters used by the augmented reality client to obtain the real-time code streams of the augmented reality camera and the common camera from the video access server.
Preferably, the manner in which the augmented reality client integrates and presents the code stream sent by the video access server and the third-party information system data sent by the data access server includes:
integrating the code stream and/or the third-party information system data with the static tag information in the augmented reality real-time code stream, and merging the data obtained from the data access server with the augmented reality real-time code stream during decoding and playback.
Preferably, the manner in which the augmented reality client integrates and presents the code stream sent by the video access server and the third-party information system data sent by the data access server includes:
integrating a third-party information system carrying GPS position information with the augmented reality real-time code stream, calculating the position of the third-party information system in the video picture from the GPS information it sends and the GPS information of the augmented reality camera carried in the augmented reality real-time code stream, and merging the data sent by the third-party information system with the augmented reality real-time code stream during decoding and playback.
Preferably, the manner in which the augmented reality client integrates and presents the code stream sent by the video access server and the third-party information system data sent by the data access server includes:
configuring the URL address of the third-party information system and the browser kernel to be used through the configuration management client; when the augmented reality client calls the third-party information system, the configured kernel is used to open the URL address corresponding to the third-party information system and link to it.
The invention has the beneficial effects that:
1. The video devices collect real-time code streams, and augmented reality information can be added to a real-time code stream to form an augmented reality real-time code stream. The augmented reality client can integrate and present third-party information system data together with the code streams of the video devices, and extracts resources valuable to on-site command and dispatch, such as operating data, alarm information and control information. With the addition of augmented reality technology, the data of the video devices and of the third-party information systems are displayed comprehensively, dynamically and in real time on a large-screen command and dispatch desk, forming a three-dimensional prevention and control system.
2. The data access server of the invention provides both active and passive access services, covering SDK interfaces, various APIs, various database interfaces, HTTP API interfaces and the like, and therefore has a strong information integration capability.
Drawings
Fig. 1 is an architecture diagram of a campus three-dimensional prevention and control system according to an embodiment of the present invention.
Detailed Description
The invention will be further elucidated and described with reference to the embodiments and drawings of the specification:
Fig. 1 is a schematic diagram of a three-dimensional prevention and control system according to an embodiment of the present invention. Referring to fig. 1, the three-dimensional prevention and control system in this embodiment specifically includes a configuration management client 101, a video access server 102, a data access server 103, an augmented reality client 104, and a plurality of video devices 105.
The configuration management client 101 is used for configuring and saving the parameters of the video devices 105, the video access server 102 and the data access server 103. It is mainly used for configuring the parameters of the system, such as providing the names and access parameters of the video devices 105 and configuring the access parameters of the video access server 102 and the data access server 103, for example adding cameras for security video monitoring and configuring the service addresses of face recognition and investigation services.
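Purely to make the kind of parameters concrete (this is not part of the original disclosure), a minimal sketch of a configuration record that such a client might persist is given below; every field name and value is an illustrative assumption.

```python
# Illustrative configuration record; field names and values are assumptions.
from dataclasses import dataclass, field


@dataclass
class CameraConfig:
    name: str                 # e.g. "gate-1"
    address: str              # stream access address
    augmented_reality: bool   # True for AR cameras that embed tags/GPS


@dataclass
class SystemConfig:
    video_access_server: str        # e.g. "192.0.2.10:9000"
    data_access_server: str         # e.g. "192.0.2.11:9100"
    face_recognition_service: str   # third-party service address
    cameras: list = field(default_factory=list)


cfg = SystemConfig(
    "192.0.2.10:9000", "192.0.2.11:9100", "192.0.2.20:8000",
    [CameraConfig("gate-1", "rtsp://192.0.2.30/stream1", True)],
)
```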
The video devices 105 are used for collecting real-time code streams, and some of them are further configured to add augmented reality information to the real-time code stream to form an augmented reality real-time code stream. In a specific implementation, the video devices 105 include a common camera and an augmented reality camera: the common camera collects a real-time code stream, while the augmented reality camera collects a real-time code stream and adds augmented reality information to it to form the augmented reality real-time code stream. In this embodiment, a video device 105 may be a stand-alone device or may be embedded in other products, such as a patrol robot with a camera, a balance car or an unmanned aerial vehicle.
The video access server 102 is used for accessing the video devices. It accesses the video devices through the 28281 protocol, or accesses the video devices of individual manufacturers through their SDKs. After a video device 105 is accessed through the video access server 102, its video stream data is uploaded to the augmented reality client 104 through the video access server 102 in real time. The video access server 102 also provides a streaming media service and supports connections from multiple clients.
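As an illustration only, the sketch below shows the fan-out role of such a video access server: frames read from one device stream are relayed to every connected client queue. The 28281 signalling and vendor SDK calls themselves are not reproduced here and are outside the scope of this sketch.

```python
# Sketch of the video access server's fan-out role (no real 28281/SDK signalling).
import queue
import threading


class StreamRelay:
    def __init__(self):
        self.clients = []              # one queue per connected client
        self.lock = threading.Lock()

    def add_client(self):
        q = queue.Queue(maxsize=100)   # bounded buffer per client
        with self.lock:
            self.clients.append(q)
        return q                       # the client reads frames from this queue

    def push_frame(self, frame_bytes):
        with self.lock:
            for q in self.clients:
                if not q.full():       # drop frames for slow clients
                    q.put(frame_bytes)
```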
The data access server 103 is used for accessing third-party information systems and receiving the data they upload. In a specific implementation, the data access server 103 provides an active access data service and a passive access data service, and a third-party information system is accessed through one of the two. The active access data service connects to the third-party information system through an SDK, an API interface or a database provided by that system; for the passive access data service, the data access server 103 provides an HTTP API interface, and a third-party information system that needs to access the three-dimensional prevention and control system actively sends its data to the data access server 103 through this HTTP API interface. Using a subscription model, the data access server 103 sends the message queue corresponding to the third-party information system's uploaded data to the subscribed augmented reality client 104. The augmented reality client 104 is connected with the video access server 102 and the data access server 103 respectively, and is used for integrating and presenting the real-time code stream sent by the video access server 102 and the third-party information system data sent by the data access server 103. The augmented reality client 104 can also decode and play the augmented reality real-time code stream after receiving it.
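To illustrate the subscription model described above (again, not part of the original disclosure), the following sketch shows a minimal topic-based message bus: augmented reality clients subscribe to topics, and the data access server dispatches third-party messages to the subscribers. The topic name and payload fields are assumptions.

```python
# Sketch of the subscription model: clients subscribe to topics, the data
# access server publishes third-party messages to the subscribers.
from collections import defaultdict


class MessageBus:
    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)


bus = MessageBus()
bus.subscribe("face_alarm", lambda m: print("AR client received:", m))
bus.publish("face_alarm", {"name": "unknown", "camera": "gate-1"})  # illustrative payload
```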
In a specific implementation, the augmented reality information added by the augmented reality camera includes one or more of a tag, longitude and latitude, altitude and a satellite clock. In a preferred embodiment, the tag is configured with the parameters used by the augmented reality client 104 to obtain the real-time code streams of the augmented reality camera and the common camera from the video access server 102.
The manner in which the augmented reality client 104 integrates and presents the code stream sent by the video access server 102 and the third-party information system data sent by the data access server 103 includes: integrating the code stream and/or the third-party information system data with the static tag information in the augmented reality real-time code stream, and merging the data obtained from the data access server with the augmented reality real-time code stream during decoding and playback. During decoding and playback, collected information such as faces and license plates can be displayed in real time. In this way, the faces, license plates and other information collected by the subsystems are displayed dynamically in the large-screen command and dispatch system; commanders no longer need to check related information on each subsystem separately, and all important information can be displayed visually together with the current overall monitoring video picture, which greatly improves working efficiency. The static tag is one kind of tag; tags comprise static tags and dynamic tags.
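As a rough illustration of this first integration manner (not part of the original disclosure), the sketch below attaches the latest third-party data to a static tag's on-screen anchor point and produces a draw instruction for the overlay. The tag coordinates and payload fields are assumptions.

```python
# Sketch: merge third-party data with a static tag's on-screen anchor point.
# Tag coordinates and payload fields are illustrative assumptions.
static_tags = {
    "gate-1-door": {"x": 420, "y": 180, "text": "East gate"},
}


def build_overlay(tag_id, third_party_data):
    """Return a draw instruction combining the static tag and live data."""
    tag = static_tags[tag_id]
    line = f'{tag["text"]}: {third_party_data.get("plate", "-")}'
    return {"x": tag["x"], "y": tag["y"], "text": line}


print(build_overlay("gate-1-door", {"plate": "ABC123"}))
# -> {'x': 420, 'y': 180, 'text': 'East gate: ABC123'}
```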
The manner of integrated presentation also includes the following two ways. In the first, a third-party information system carrying GPS position information is integrated with the augmented reality real-time code stream: the position of the third-party information system in the video picture is calculated from the GPS information it sends and the GPS information of the augmented reality camera carried in the augmented reality real-time code stream, and the data sent by the third-party information system is merged with the augmented reality real-time code stream during decoding and playback. During decoding and playback, collected information such as faces and license plates can be displayed in real time. In this way, data sent by a third-party information system carrying GPS position information, such as a mobile phone, a handheld terminal or a vehicle, can be presented in real time within the augmented reality video picture. In the second, the URL address of the third-party information system and the browser kernel to be used are configured through the configuration management client 101; when the augmented reality client 104 calls the third-party information system, the configured kernel is used to open the URL address corresponding to that system and link to it. Through the embedded IE kernel and Webkit kernel, this kind of integration enables fast linkage between people-flow monitoring and other systems.
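The original text does not give the projection formula for placing a GPS-tagged target in the video picture. The sketch below is one common way to estimate the horizontal pixel position from the camera's GPS position, heading and horizontal field of view; the approximation and all parameter values are assumptions for illustration only.

```python
# Illustrative (assumed) projection of a GPS target into the camera frame,
# using the bearing from camera to target, the camera heading and its FOV.
import math


def gps_to_pixel_x(cam_lat, cam_lon, cam_heading_deg, fov_deg, frame_width,
                   tgt_lat, tgt_lon):
    # Bearing from camera to target (approximate; adequate over campus distances).
    d_lon = math.radians(tgt_lon - cam_lon)
    lat1, lat2 = math.radians(cam_lat), math.radians(tgt_lat)
    y = math.sin(d_lon) * math.cos(lat2)
    x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(d_lon)
    bearing = math.degrees(math.atan2(y, x)) % 360
    # Offset from the camera heading, mapped onto the horizontal pixel axis.
    offset = (bearing - cam_heading_deg + 540) % 360 - 180
    if abs(offset) > fov_deg / 2:
        return None                   # target is outside the current view
    return int((offset / fov_deg + 0.5) * frame_width)


print(gps_to_pixel_x(22.54, 114.05, 90.0, 60.0, 1920, 22.54, 114.06))
```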
The system can integrate information such as face control, vehicle control, intelligent analysis control and mobile police force. When an abnormal alarm occurs, the system automatically pops up the alarm information, automatically and quickly controls the pan-tilt to turn toward the area where the alarm occurred, and displays the alarm information visually in the large-screen command and dispatch system, so that local on-site information is grasped in real time and the efficiency of command and dispatch is improved.
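A minimal sketch of this alarm linkage is given below, purely for illustration: on an alarm message, the client raises an alert and asks a pan-tilt control interface to turn toward the alarm area. The preset table and the goto_preset() call stand in for whatever control interface the deployed camera actually provides; they are assumptions, not part of the disclosure.

```python
# Sketch of the alarm linkage; goto_preset() stands in for the real PTZ call.
ALARM_PRESETS = {"playground": 3, "east-gate": 5}   # assumed area -> PTZ preset


def goto_preset(preset_id):
    print(f"PTZ turning to preset {preset_id}")


def on_alarm(alarm):
    area = alarm["area"]
    print(f"ALERT: {alarm['type']} at {area}")       # pop-up on the command desk
    preset = ALARM_PRESETS.get(area)
    if preset is not None:
        goto_preset(preset)                          # hypothetical PTZ call


on_alarm({"type": "face_control_hit", "area": "east-gate"})
```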
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present invention and not to limit its scope of protection. Although the present invention is described in detail with reference to the preferred embodiments, those skilled in the art should understand that modifications or equivalent substitutions can be made to the technical solutions of the present invention without departing from the spirit and scope of those technical solutions.

Claims (10)

1. A campus three-dimensional prevention and control system, characterized by comprising a configuration management client, a video access server, a data access server, an augmented reality client and a plurality of video devices;
the configuration management client is used for configuring and storing parameters of the video devices, the video access server and the data access server;
the video devices are used for collecting real-time code streams, wherein some of the video devices are also used for adding augmented reality information to the real-time code stream to form an augmented reality real-time code stream;
the video access server is connected with the video devices and is used for distributing the code streams to the augmented reality client;
the data access server is used for accessing a third-party information system and receiving data sent by the third-party information system;
the augmented reality client is connected with the video access server and the data access server respectively, and is used for integrating and presenting the code stream sent by the video access server and the third-party information system data sent by the data access server.
2. The campus three-dimensional prevention and control system of claim 1, wherein the video access server accesses the video devices through the 28281 protocol or through an SDK.
3. The campus three-dimensional prevention and control system of claim 1, wherein the data access server provides an active access data service and a passive access data service, and the third-party information system is accessed through the active access data service or the passive access data service.
4. The campus three-dimensional prevention and control system of claim 3, wherein the active access data service connects to the third-party information system through an SDK, an API interface or a database provided by the third-party information system;
the passive access data service provides an HTTP API interface, and the third-party information system actively sends data to the data access server through the HTTP API interface.
5. The campus three-dimensional prevention and control system of claim 1, wherein the video devices comprise a common camera for collecting a real-time code stream and an augmented reality camera for collecting a real-time code stream and adding augmented reality information to it to form an augmented reality real-time code stream.
6. The campus three-dimensional prevention and control system of claim 5, wherein the augmented reality information comprises one or more of a tag, longitude and latitude, altitude, and a satellite clock.
7. The campus three-dimensional prevention and control system of claim 6, wherein the tag is configured with the parameters used by the augmented reality client to obtain the real-time code streams of the augmented reality camera and the common camera from the video access server.
8. The campus three-dimensional prevention and control system of claim 7, wherein the manner in which the augmented reality client integrates and presents the code stream sent by the video access server and the third-party information system data sent by the data access server comprises: integrating the code stream and/or the third-party information system data with the static tag information in the augmented reality real-time code stream, and merging the data obtained from the data access server with the augmented reality real-time code stream during decoding and playback.
9. The campus three-dimensional prevention and control system of any one of claims 1 to 8, wherein the manner in which the augmented reality client integrates and presents the code stream sent by the video access server and the third-party information system data sent by the data access server comprises:
integrating a third-party information system carrying GPS position information with the augmented reality real-time code stream, calculating the position of the third-party information system in the video picture from the GPS information sent by the third-party information system and the GPS information of the augmented reality camera carried in the augmented reality real-time code stream, and merging the data sent by the third-party information system with the augmented reality real-time code stream during decoding and playback.
10. The campus three-dimensional prevention and control system of any one of claims 1 to 8, wherein the manner in which the augmented reality client integrates and presents the code stream sent by the video access server and the third-party information system data sent by the data access server comprises:
configuring the URL address of the third-party information system and the browser kernel to be used through the configuration management client; when the augmented reality client calls the third-party information system, the configured kernel is used to open the URL address corresponding to the third-party information system and link to it.
CN201911375473.9A 2019-12-27 2019-12-27 Campus three-dimensional prevention and control system Pending CN111083445A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911375473.9A CN111083445A (en) 2019-12-27 2019-12-27 Campus three-dimensional prevention and control system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911375473.9A CN111083445A (en) 2019-12-27 2019-12-27 Campus three-dimensional prevention and control system

Publications (1)

Publication Number Publication Date
CN111083445A (en) 2020-04-28

Family

ID=70318402

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911375473.9A Pending CN111083445A (en) 2019-12-27 2019-12-27 Campus three-dimensional prevention and control system

Country Status (1)

Country Link
CN (1) CN111083445A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111770181A (en) * 2020-06-29 2020-10-13 重庆紫光华山智安科技有限公司 Access method, device and equipment of Internet of things equipment
CN111770181B (en) * 2020-06-29 2022-11-11 重庆紫光华山智安科技有限公司 Access method, device and equipment of Internet of things equipment

Similar Documents

Publication Publication Date Title
US11943693B2 (en) Providing status of user devices during a biological threat event
US10178537B2 (en) Emergency messaging system and method of responding to an emergency
US8711732B2 (en) Synthesized interoperable communications
US20130141460A1 (en) Method and apparatus for virtual incident representation
CN111612933A (en) Augmented reality intelligent inspection system based on edge cloud server
CN108965825A (en) Video interlink dispatching method based on holographic situational map
US20050273330A1 (en) Anti-terrorism communications systems and devices
US20070088553A1 (en) Synthesized interoperable communications
CN202773002U (en) Integrated visualized command and dispatch platform
CN108810462A (en) A kind of camera video interlock method and system based on location information
CN107426065B (en) Three-dimensional prevention and control system
CN111083445A (en) Campus three-dimensional prevention and control system
CN111767898B (en) Service data processing method, device, equipment and storage medium
CN102685459A (en) Ambulance vehicle-mounted third generation (3G) wireless video monitoring device and monitoring method thereof
CN103826098A (en) Forestry cloud platform based on video and positioning services
CN111583074A (en) Campus security management system and method
CN103839107A (en) Social management grid system
CN114401386B (en) Two-to-many remote e-interrogation system and method for intelligent public security
CN110855929B (en) Tax coordination command system
US20210258757A1 (en) Emergency Response Communication System
Timothy An android location-based crime reporting system using the Google Map API
KR101383583B1 (en) Remote supervisory method for scene of accident, management method for remote supervisory system and remote supervisory realization method for application operated on mobile terminal
CN113573025A (en) Monitoring video viewing method and device, terminal equipment and storage medium
CN111405244A (en) Online remote video wisdom community correction system
CN112911212A (en) Scenic spot command system and method based on big data processing

Legal Events

Date Code Title Description
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200428